mcp-client-for-ollama and Local_MCP_Client

These are competing projects offering different interfaces (terminal-based vs. web-based) to the same core functionality: connecting local LLMs, via Ollama, to MCP servers for tool execution.

mcp-client-for-ollama — overall score 67 (Established)
  Maintenance: 13/25 | Adoption: 10/25 | Maturity: 24/25 | Community: 20/25
  Stars: 563 | Forks: 82 | Downloads: | Commits (30d): 1 | Language: Python | License: MIT
  Risk flags: none

Local_MCP_Client — overall score 37 (Emerging)
  Maintenance: 6/25 | Adoption: 4/25 | Maturity: 15/25 | Community: 12/25
  Stars: 5 | Forks: 1 | Downloads: | Commits (30d): 0 | Language: Python | License: Apache-2.0
  Risk flags: No Package, No Dependents

About mcp-client-for-ollama

jonigl/mcp-client-for-ollama

A text-based user interface (TUI) client for interacting with MCP servers using Ollama. Features include agent mode, multi-server support, model switching, streaming responses, tool management, human-in-the-loop confirmation, thinking mode, model parameter configuration, MCP prompts, a custom system prompt, and saved preferences. Built for developers working with local LLMs.

This interactive terminal application helps developers connect their local Large Language Models (LLMs) running on Ollama to external tools and services defined by the Model Context Protocol (MCP). It allows real-time management of models, tools, and server connections, facilitating advanced LLM-driven automation and experimentation. Developers enter natural-language prompts, and the system returns responses, potentially enhanced by tool calls, all within a user-friendly text interface.
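The tool-call loop such a client runs can be sketched generically. Everything below is illustrative: the tool name, the stubbed model, and the registry are hypothetical stand-ins, not code from either repository (a real client would dispatch the call to an MCP server and obtain the tool request from an actual Ollama chat response):

```python
import json

# Hypothetical tool; in a real client this would be provided by an MCP
# server over stdio or HTTP rather than defined locally.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

def fake_model(prompt: str) -> dict:
    """Stand-in for an Ollama chat call that proposes a tool call.
    A real client would call the Ollama API with the tool schemas
    and parse the tool-call request out of the model's reply."""
    return {"tool": "get_weather", "arguments": {"city": "Lima"}}

def run_turn(prompt: str) -> str:
    """One turn of the loop: the model proposes a tool call, the
    client looks it up in its registry and executes it."""
    call = fake_model(prompt)
    fn = TOOLS[call["tool"]]
    return fn(**call["arguments"])

if __name__ == "__main__":
    print(run_turn("What's the weather in Lima?"))  # Sunny in Lima
```

In the real clients this result would then be fed back to the model as a tool message so it can compose its final answer; the human-in-the-loop feature inserts a confirmation step before the tool actually runs.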

Tags: LLM-development, tool-integration, local-AI-models, prompt-engineering, AI-application-testing

About Local_MCP_Client

mytechnotalent/Local_MCP_Client

Local MCP Client is a cross-platform web and API interface for interacting with configurable MCP servers in natural language. Powered by Ollama and any local LLM of choice, it enables structured tool execution and dynamic agent behavior.

Scores updated daily from GitHub, PyPI, and npm data.