ollama-mcp-bridge and mindbridge-mcp
The two are complements: ollama-mcp-bridge integrates MCP servers into Ollama's API, while mindbridge-mcp is an orchestration server providing unified LLM routing across multiple providers; chained together, they can route queries through Ollama to a variety of MCP-integrated tools.
About ollama-mcp-bridge
jonigl/ollama-mcp-bridge
Extend the Ollama API with dynamic AI tool integration from multiple MCP (Model Context Protocol) servers. Fully compatible, transparent, and developer-friendly; ideal for building powerful local LLM applications, AI agents, and custom chatbots.
This project helps developers integrate external functionalities, called "tools," into their local large language model (LLM) applications powered by Ollama. It acts as an intermediary: it accepts requests that involve tool calls, passes them to Ollama, and transparently manages the tools' execution and responses. Software developers building AI agents, custom chatbots, or other LLM-driven applications will find it useful for extending their models' capabilities.
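Because the bridge is transparent, a client talks to it exactly as it would talk to Ollama's own chat endpoint. The sketch below builds a standard Ollama-style `/api/chat` payload; the bridge's host and port (`localhost:8000` here) are assumptions for illustration, not documented defaults.

```python
import json

# Assumed address where the bridge is running (configure to match your setup).
BRIDGE_URL = "http://localhost:8000/api/chat"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build a plain Ollama-style chat payload. The bridge forwards it to
    Ollama and injects/handles MCP tool definitions behind the scenes, so
    the client does not list tools itself."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

payload = build_chat_request("llama3.1", "What's the weather in Paris?")
print(json.dumps(payload))
# You would POST this JSON to BRIDGE_URL, e.g. with requests.post(BRIDGE_URL, json=payload)
```

The point of the design is that no bridge-specific fields appear in the request: swapping the base URL from Ollama to the bridge is the only client-side change.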
About mindbridge-mcp
pinkpixel-dev/mindbridge-mcp
MindBridge is an AI orchestration MCP server that lets any app talk to any LLM — OpenAI, Anthropic, DeepSeek, Ollama, and more — through a single unified API. Route queries, compare models, get second opinions, and build smarter multi-LLM workflows.
This tool helps developers and AI engineers manage large language models (LLMs) from various providers, such as OpenAI, Anthropic, and local models, through a single interface. You supply your application's queries and API keys, and it routes each query to the best-suited LLM and returns that model's response. It's designed for anyone building applications powered by multiple AI models.
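The core idea of a unified multi-LLM API is a routing layer that maps a model name to a provider backend. The following is a hypothetical, simplified sketch of that pattern, not MindBridge's actual API; the provider table and prefix-matching rule are invented for illustration.

```python
# Hypothetical provider registry: model-name prefix -> (provider, base URL).
PROVIDERS = {
    "gpt": ("openai", "https://api.openai.com/v1"),
    "claude": ("anthropic", "https://api.anthropic.com/v1"),
    "llama": ("ollama", "http://localhost:11434"),
}

def route(model: str) -> tuple[str, str]:
    """Pick a provider backend based on the requested model's name prefix."""
    for prefix, (provider, base_url) in PROVIDERS.items():
        if model.startswith(prefix):
            return provider, base_url
    raise ValueError(f"no provider registered for model {model!r}")

print(route("claude-3-5-sonnet"))  # routes to the Anthropic backend
print(route("llama3.1"))           # routes to the local Ollama backend
```

A "second opinion" workflow then falls out naturally: send the same query through `route()` for two different models and compare the responses.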