ollama-mcp-bridge and mindbridge-mcp

These tools are complementary: ollama-mcp-bridge integrates MCP servers into Ollama's API, while mindbridge-mcp is an orchestration server that provides unified LLM routing across multiple providers. The two can be chained so that queries flow through Ollama to a variety of MCP-integrated tools.

|                | ollama-mcp-bridge | mindbridge-mcp |
|----------------|-------------------|----------------|
| Score          | 61 (Established)  | 42 (Emerging)  |
| Maintenance    | 10/25             | 2/25           |
| Adoption       | 8/25              | 7/25           |
| Maturity       | 24/25             | 16/25          |
| Community      | 19/25             | 17/25          |
| Stars          | 67                | 28             |
| Forks          | 21                | 8              |
| Downloads      |                   |                |
| Commits (30d)  | 0                 | 0              |
| Language       | Python            | TypeScript     |
| License        | MIT               | MIT            |
| Risk flags     | None              | Stale 6m, No Package, No Dependents |

About ollama-mcp-bridge

jonigl/ollama-mcp-bridge

Extend the Ollama API with dynamic AI tool integration from multiple MCP (Model Context Protocol) servers. Fully compatible, transparent, and developer-friendly; ideal for building powerful local LLM applications, AI agents, and custom chatbots.

This project helps developers integrate external functionalities, called "tools," into local large language model (LLM) applications powered by Ollama. It acts as an intermediary: it accepts requests that include tool calls, forwards them to Ollama, and transparently manages tool execution and responses. Software developers building AI agents, custom chatbots, or other LLM-driven applications will find it useful for extending their models' capabilities.
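Because the bridge is API-compatible with Ollama, a client talks to it exactly as it would talk to Ollama's `/api/chat` endpoint, and the bridge handles MCP tool execution behind the scenes. A minimal sketch, assuming the bridge is running locally on port 8000 (the port and model name here are assumptions, not documented defaults):

```python
import json
import urllib.request

BRIDGE_URL = "http://localhost:8000/api/chat"  # assumed bridge address

def build_chat_request(model: str, prompt: str) -> dict:
    """Build a standard Ollama-style chat payload.

    No tool definitions are needed here: the bridge injects tools from its
    configured MCP servers transparently before forwarding to Ollama.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def ask_bridge(model: str, prompt: str) -> str:
    """POST the request to the bridge and return the assistant's reply text."""
    payload = build_chat_request(model, prompt)
    req = urllib.request.Request(
        BRIDGE_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["message"]["content"]
```

The point of the design is that existing Ollama clients need no changes: swapping the base URL from Ollama to the bridge is enough to gain tool use.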

Tags: AI application development, LLM integration, chatbot development, tool-use AI, local AI development

About mindbridge-mcp

pinkpixel-dev/mindbridge-mcp

MindBridge is an AI orchestration MCP server that lets any app talk to any LLM — OpenAI, Anthropic, DeepSeek, Ollama, and more — through a single unified API. Route queries, compare models, get second opinions, and build smarter multi-LLM workflows.

This tool helps developers and AI engineers manage different large language models (LLMs) from various providers like OpenAI, Anthropic, or even local models, through a single interface. You input your application's queries and API keys, and it routes them to the best-suited LLM, returning the model's response. It's designed for anyone building applications powered by multiple AI models.
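Since MindBridge is an MCP server, applications reach it through the Model Context Protocol's JSON-RPC `tools/call` method rather than a provider-specific SDK. The sketch below builds such a message; the tool name `getSecondOpinion` and its argument names are assumptions for illustration, so check the server's `tools/list` response for the real schema:

```python
import json

def build_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Serialize an MCP tools/call request as a JSON-RPC 2.0 message.

    MCP clients send messages like this over the server's transport
    (typically stdio or SSE).
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical tool and arguments; not taken from the MindBridge docs.
msg = build_tool_call("getSecondOpinion", {
    "provider": "anthropic",
    "prompt": "Review this SQL query for injection risks.",
})
```

Because the protocol layer is the same for every provider MindBridge wraps, switching from OpenAI to a local Ollama model is a change of tool arguments, not of client code.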

Tags: AI application development, LLM orchestration, multi-model management, agent building, AI backend engineering

Scores are updated daily from GitHub, PyPI, and npm data.