rawveg/ollama-mcp
An MCP Server for Ollama
This project connects locally hosted Ollama large language models (LLMs) to MCP-compatible AI assistant clients such as Claude Desktop or Cline. It exposes your local Ollama setup, including its models and their capabilities, as 'tools' that these assistants can call, letting developers use their self-hosted LLMs directly within compatible applications.
143 stars. Available on npm.
Use this if you are a developer who wants to connect your local Ollama models and their built-in functionalities (like text generation, chat, or embeddings) directly to an MCP-compatible AI assistant application.
Not ideal if you don't run a local Ollama setup, or if you work primarily with cloud-hosted LLMs and have no need to integrate them with a local AI assistant client.
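As a rough illustration of how an MCP client consumes a server like this, the sketch below uses the TypeScript MCP SDK (@modelcontextprotocol/sdk) to launch the server over stdio and list the tools it advertises. The npm package name ("ollama-mcp") and the exact tool set are assumptions here; check the project's README for the actual install command and tool names.

// Sketch: connect an MCP client to the Ollama MCP server over stdio
// and list the tools it exposes. Requires @modelcontextprotocol/sdk.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Assumption: the server is published on npm as "ollama-mcp";
  // substitute the package name from the project's README if it differs.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "ollama-mcp"],
  });

  const client = new Client(
    { name: "example-client", version: "0.1.0" },
    { capabilities: {} }
  );

  await client.connect(transport);

  // List whatever tools the server advertises (text generation, chat,
  // embeddings, etc., depending on your local Ollama models).
  const { tools } = await client.listTools();
  for (const tool of tools) {
    console.log(`${tool.name}: ${tool.description ?? ""}`);
  }

  await client.close();
}

main().catch(console.error);

Clients like Claude Desktop or Cline do the equivalent of this automatically once the server is registered in their configuration, so the sketch is only meant to show what happens under the hood.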
Stars: 143
Forks: 24
Language: TypeScript
License: AGPL-3.0
Category:
Last pushed: Nov 10, 2025
Commits (30d): 0
Dependencies: 5
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mcp/rawveg/ollama-mcp"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
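For programmatic use, a minimal sketch of calling the same endpoint from TypeScript (Node 18+ with the built-in fetch) is shown below. The response schema isn't documented here, so it is treated as opaque JSON; no API key header is set since the keyless tier allows 100 requests/day.

// Fetch the quality data for rawveg/ollama-mcp from the public API.
const url =
  "https://pt-edge.onrender.com/api/v1/quality/mcp/rawveg/ollama-mcp";

async function fetchQuality(): Promise<unknown> {
  const res = await fetch(url);
  if (!res.ok) {
    throw new Error(`Request failed: ${res.status} ${res.statusText}`);
  }
  // Schema is not documented here, so return the parsed JSON as-is.
  return res.json();
}

fetchQuality()
  .then((data) => console.log(JSON.stringify(data, null, 2)))
  .catch((err) => console.error(err));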
Related servers
DMontgomery40/deepseek-mcp-server
Model Context Protocol server for DeepSeek's advanced language models
upstash/context7
Context7 Platform -- Up-to-date code documentation for LLMs and AI code editors
graphlit/graphlit-mcp-server
Model Context Protocol (MCP) Server for Graphlit Platform
dvcrn/mcp-server-siri-shortcuts
MCP for calling Siri Shortcuts from LLMs
e2b-dev/mcp-server
Giving Claude ability to run code with E2B via MCP (Model Context Protocol)