mcp-client-for-ollama and zin-mcp-client
These projects are direct competitors: both provide TUI/client interfaces for connecting local Ollama LLMs to MCP servers. The first is more mature and feature-rich (agent mode, multi-server support, tool management); the second is a simpler alternative for the same use case.
About mcp-client-for-ollama
jonigl/mcp-client-for-ollama
A text-based user interface (TUI) client for interacting with MCP servers using Ollama. Features include agent mode, multi-server connections, model switching, streaming responses, tool management, human-in-the-loop confirmations, thinking mode, model parameter configuration, MCP prompts, custom system prompts, and saved preferences. Built for developers working with local LLMs.
This interactive terminal application helps developers connect their local Large Language Models (LLMs) running on Ollama to external tools and services defined by the Model Context Protocol (MCP). It allows real-time management of models, tools, and server connections, facilitating advanced LLM-driven automation and experimentation. Developers input natural language prompts, and the system returns responses, potentially enhanced by tool calls, all within a user-friendly text interface.
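MCP servers are conventionally declared in a JSON config of the style popularized by Claude Desktop, mapping a server name to the command that launches it over stdio. A minimal sketch (the server name, command, and path here are illustrative placeholders, not this client's documented defaults):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    }
  }
}
```

A client reading such a file spawns each listed command as a subprocess and speaks the MCP protocol to it over stdin/stdout.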
About zin-mcp-client
zinja-coder/zin-mcp-client
An MCP client that serves as a bridge between MCP servers and local LLMs running on Ollama. Created primarily for the author's own MCP servers, though other MCP servers may work as well.
Implements a ReAct agent framework using LangChain to intelligently route tool invocations across multiple MCP servers via stdio transport, with support for local LLM reasoning through Ollama. Offers multiple interfaces—CLI, lightweight web UI, and Open Web UI plugin—enabling flexible deployment from personal development to integrated environments. Designed as a minimal, performant bridge prioritizing stdio-based MCP server compatibility over feature bloat.
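The ReAct pattern described above alternates model reasoning with tool invocations until the model produces a final answer. The sketch below is a hypothetical, dependency-free illustration of that loop, not zin-mcp-client's actual LangChain code: the scripted stub stands in for an Ollama model, and the tool registry stands in for connected MCP servers.

```python
import re

# Hypothetical tool registry; in a real client these would be MCP tools
# discovered from connected servers over stdio.
TOOLS = {"add": lambda arg: str(sum(int(x) for x in arg.split("+")))}

def scripted_llm(history: str) -> str:
    """Stub model: first requests the 'add' tool, then finishes."""
    if "Observation:" not in history:
        return "Thought: I need arithmetic.\nAction: add[2+3]"
    return "Final: 5"

def react_loop(question, llm, tools, max_steps=5):
    history = f"Question: {question}"
    for _ in range(max_steps):
        reply = llm(history)
        match = re.search(r"Action: (\w+)\[(.*?)\]", reply)
        if match:
            name, arg = match.groups()
            observation = tools[name](arg)  # dispatch to a tool
            history += f"\n{reply}\nObservation: {observation}"
        else:
            return reply.removeprefix("Final: ").strip()
    return "step limit reached"

print(react_loop("what is 2+3?", scripted_llm, TOOLS))  # prints "5"
```

The step cap guards against a model that loops forever on tool calls, which is a standard safeguard in agent frameworks.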