rawveg/ollama-mcp

An MCP Server for Ollama

Score: 59 / 100 (Established)

This project lets developers integrate local Ollama-powered large language models (LLMs) with MCP-compatible AI assistant clients such as Claude Desktop or Cline. It exposes your local Ollama models and their capabilities (text generation, chat, embeddings) as 'tools' that these assistants can call, so developers can use self-hosted LLMs directly within compatible applications.
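Under the Model Context Protocol, clients invoke a server's tools via JSON-RPC 2.0 `tools/call` requests. The sketch below shows the shape of such a request in TypeScript; the tool name `ollama_generate` and its arguments are hypothetical placeholders, since the actual tool names are defined by this server's implementation.

```typescript
// Shape of a JSON-RPC 2.0 request an MCP client sends to invoke a tool.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: {
    name: string;                       // tool name exposed by the server
    arguments: Record<string, unknown>; // tool-specific arguments
  };
}

function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>,
): ToolCallRequest {
  return { jsonrpc: "2.0", id, method: "tools/call", params: { name, arguments: args } };
}

// Hypothetical example: ask a local Ollama model for a completion.
const req = buildToolCall(1, "ollama_generate", {
  model: "llama3",
  prompt: "Summarise MCP in one sentence.",
});

console.log(JSON.stringify(req));
```

In practice the client library serializes and transports this for you (typically over stdio); the sketch only illustrates the wire format the server responds to.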

143 stars. Available on npm.

Use this if you are a developer who wants to connect your local Ollama models and their built-in functionalities (like text generation, chat, or embeddings) directly to an MCP-compatible AI assistant application.

Not ideal if you are an end-user without a local Ollama setup or if you primarily interact with cloud-based LLMs without needing to integrate them into a specific local AI assistant client.

Tags: AI-development, LLM-integration, local-AI, AI-assistant-tools, developer-workflow
Score breakdown:
  Maintenance 6 / 25
  Adoption 10 / 25
  Maturity 25 / 25
  Community 18 / 25


Stars: 143
Forks: 24
Language: TypeScript
License: AGPL-3.0
Last pushed: Nov 10, 2025
Commits (30d): 0
Dependencies: 5

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/mcp/rawveg/ollama-mcp"

Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.
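The same endpoint can be called programmatically. A minimal TypeScript sketch is below, assuming only the URL pattern shown in the curl example; the response schema is not documented here, so the code just pretty-prints whatever JSON comes back.

```typescript
// Build the quality-API URL for a given repo, following the curl example above.
const BASE = "https://pt-edge.onrender.com/api/v1/quality/mcp";

function qualityUrl(owner: string, repo: string): string {
  return `${BASE}/${encodeURIComponent(owner)}/${encodeURIComponent(repo)}`;
}

// Fetch and return the (undocumented) JSON payload. Requires Node 18+ for
// the global fetch.
async function fetchQuality(owner: string, repo: string): Promise<unknown> {
  const res = await fetch(qualityUrl(owner, repo));
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json();
}

console.log(qualityUrl("rawveg", "ollama-mcp"));
// Uncomment to hit the live endpoint:
// fetchQuality("rawveg", "ollama-mcp").then((d) => console.log(JSON.stringify(d, null, 2)));
```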