jonigl/mcp-client-for-ollama

A text-based user interface (TUI) client for interacting with MCP servers using Ollama. Features include agent mode, multi-server, model switching, streaming responses, tool management, human-in-the-loop, thinking mode, model params config, MCP prompts, custom system prompt and saved preferences. Built for developers working with local LLMs.

Quality score: 67 / 100 (Established)

This interactive terminal application connects local Large Language Models (LLMs) running on Ollama to external tools and services exposed by Model Context Protocol (MCP) servers. It supports real-time management of models, tools, and server connections, enabling LLM-driven automation and experimentation. Developers enter natural-language prompts and receive responses, optionally enhanced by tool calls, all within a text-based interface.

563 stars. 1 commit in the last 30 days. Available on PyPI.

Use this if you are a developer building, testing, or exploring LLM-powered applications and need to integrate local Ollama models with custom tools via MCP servers without writing extensive boilerplate code.

Not ideal if you are an end-user looking for a ready-to-use AI assistant for general tasks, or if you don't work with local LLMs and external tool integration.

Tags: LLM development, tool integration, local AI models, prompt engineering, AI application testing
Maintenance 13 / 25
Adoption 10 / 25
Maturity 24 / 25
Community 20 / 25
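
The overall score appears to be the sum of the four sub-scores, each out of 25. A quick sanity check (the additive scoring model is an assumption inferred from the numbers shown, not documented API behavior):

```python
# Sub-scores as listed on this page, each out of 25.
subscores = {"Maintenance": 13, "Adoption": 10, "Maturity": 24, "Community": 20}

# Summing them reproduces the overall 67/100 score.
total = sum(subscores.values())
print(total)  # → 67
```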


Stars: 563
Forks: 82
Language: Python
License: MIT
Last pushed: Feb 19, 2026
Commits (30d): 1
Dependencies: 5

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/mcp/jonigl/mcp-client-for-ollama"

Open to everyone: 100 requests/day with no key required. Get a free key for 1,000 requests/day.