mcp-llm and locallama-mcp
These two tools are complementary ecosystem siblings: `locallama-mcp` is a specialized implementation of an LLM-routing MCP server, which fits the broader purpose stated for `mcp-llm` of giving LLMs access to other LLMs.
About mcp-llm
sammcj/mcp-llm
An MCP server that provides LLMs access to other LLMs
This tool helps developers quickly generate, document, and manage code using large language models. It takes descriptions of desired code or existing code snippets as input and outputs code, documentation, or answers to programming questions. It's designed for software developers who need assistance with coding tasks and prefer to integrate AI directly into their development workflow.
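The pattern described above, exposing code generation as a tool that forwards a request to a backend model, can be sketched as follows. This is an illustrative outline only; the function names `generate_code` and `call_llm` and the prompt format are assumptions, not the actual `mcp-llm` API, and the backend is stubbed out so the sketch is self-contained.

```python
def call_llm(prompt: str) -> str:
    """Stand-in for a real LLM backend call (hypothetical)."""
    return f"// generated for: {prompt}"

def generate_code(description: str, language: str = "typescript") -> str:
    """Build a prompt from the user's description and delegate it
    to the backing LLM, returning the generated code."""
    prompt = f"Write {language} code that does the following: {description}"
    return call_llm(prompt)

# A tool like this would be registered with the MCP server so that a
# client (e.g. an editor assistant) can invoke it with a description.
print(generate_code("parse a CSV file"))
```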
About locallama-mcp
Heratiki/locallama-mcp
An MCP server that works with Roo Code/Cline.Bot/Claude Desktop to optimize costs by intelligently routing coding tasks between local LLMs, free APIs, and paid APIs.
This tool helps developers reduce the cost of using large language models (LLMs) for coding tasks. It acts as a smart router, taking your coding requests and deciding whether to send them to a free, local LLM or a more expensive, cloud-based API. The output is optimized code generation at a lower overall cost, ideal for individual developers or small teams managing LLM expenses.
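The routing decision described above can be sketched as a simple tiered policy. The thresholds, tier names, and complexity score here are illustrative assumptions for the sketch, not `locallama-mcp`'s actual routing logic.

```python
def route(task_tokens: int, complexity: float,
          local_threshold: float = 0.5,
          free_api_token_limit: int = 4000) -> str:
    """Pick a backend tier for a coding task.

    complexity: rough 0.0-1.0 difficulty estimate (assumed metric).
    Returns 'local', 'free-api', or 'paid-api'.
    """
    if complexity <= local_threshold:
        return "local"      # simple task: the free local LLM suffices
    if task_tokens <= free_api_token_limit:
        return "free-api"   # moderate task that fits free-tier limits
    return "paid-api"       # large/complex task: pay for a stronger model

print(route(task_tokens=500, complexity=0.2))   # small, easy task
print(route(task_tokens=9000, complexity=0.9))  # large, hard task
```

A real router would also weigh model availability and per-token pricing, but the cost saving comes from the same idea: only escalate to a paid API when cheaper tiers cannot handle the task.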