mcp-llm and locallama-mcp

These two tools are complementary siblings in the MCP ecosystem: `locallama-mcp` is a specialized LLM-routing MCP server, which aligns with the broader purpose of `mcp-llm`, namely giving LLMs access to other LLMs.

Scores at a glance

mcp-llm: 51 (Established)
  Maintenance 0/25 · Adoption 9/25 · Maturity 25/25 · Community 17/25
  Stars: 77 · Forks: 14 · Downloads: n/a · Commits (30d): 0
  Language: JavaScript · License: MIT
  Flags: Stale 6m

locallama-mcp: 35 (Emerging)
  Maintenance 2/25 · Adoption 7/25 · Maturity 8/25 · Community 18/25
  Stars: 41 · Forks: 12 · Downloads: n/a · Commits (30d): 0
  Language: TypeScript · License: none
  Flags: No License, Stale 6m, No Package, No Dependents

About mcp-llm

sammcj/mcp-llm

An MCP server that provides LLMs access to other LLMs

This tool helps developers quickly generate, document, and manage code using large language models. It takes descriptions of desired code or existing code snippets as input and outputs code, documentation, or answers to programming questions. It's designed for software developers who need assistance with coding tasks and prefer to integrate AI directly into their development workflow.
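For context, MCP servers like this one are typically registered with an MCP client through a JSON config file. A minimal sketch for Claude Desktop's `claude_desktop_config.json` follows; the server name, launch command, and package name here are assumptions for illustration, so check the repo's README for the actual installation instructions:

```json
{
  "mcpServers": {
    "llm": {
      "command": "npx",
      "args": ["-y", "mcp-llm"]
    }
  }
}
```

Once registered, the client starts the server process and exposes its tools (for example, code generation or documentation prompts routed to another LLM) inside the chat interface.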

software-development code-generation developer-tools documentation-automation programming-assistant

About locallama-mcp

Heratiki/locallama-mcp

An MCP server that works with Roo Code, Cline.Bot, or Claude Desktop to optimize costs by intelligently routing coding tasks between local LLMs, free APIs, and paid APIs.

This tool helps developers reduce the cost of using large language models (LLMs) for coding tasks. It acts as a smart router, taking your coding requests and deciding whether to send them to a free, local LLM or a more expensive, cloud-based API. The output is optimized code generation at a lower overall cost, ideal for individual developers or small teams managing LLM expenses.
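The routing decision described above can be sketched in a few lines. This is a hypothetical illustration of the general cost-aware routing idea, not `locallama-mcp`'s actual API: the function name, `TaskEstimate` shape, and thresholds are all invented for the example.

```typescript
// Sketch of cost-aware routing: small, simple tasks stay on a free local
// model; large or complex tasks escalate to hosted APIs.
type Route = "local" | "free-api" | "paid-api";

interface TaskEstimate {
  promptTokens: number; // estimated size of the coding request
  complexity: number;   // 0..1 heuristic, e.g. derived from task type
}

function chooseRoute(task: TaskEstimate, localMaxTokens = 4096): Route {
  // Prompts too large for the local model's context window must go remote.
  if (task.promptTokens > localMaxTokens) {
    return task.complexity > 0.7 ? "paid-api" : "free-api";
  }
  // Simple tasks stay local: zero marginal cost.
  if (task.complexity <= 0.5) return "local";
  // Moderately complex but small tasks try a free hosted API first.
  return task.complexity <= 0.8 ? "free-api" : "paid-api";
}

console.log(chooseRoute({ promptTokens: 800, complexity: 0.2 }));  // "local"
console.log(chooseRoute({ promptTokens: 8000, complexity: 0.9 })); // "paid-api"
```

The useful property of this shape is that the cost policy lives in one pure function, so thresholds can be tuned (or replaced with per-token pricing data) without touching the MCP plumbing.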

software-development LLM-ops developer-tools cost-optimization AI-coding-assistant

Scores updated daily from GitHub, PyPI, and npm data.