ollama-mcp and claude-code-mcp
Both projects are independent MCP servers that integrate a different large language model backend (Ollama and Claude, respectively) into the MCP ecosystem, making them ecosystem siblings: similar functionality, distinct LLM backends.
About ollama-mcp
rawveg/ollama-mcp
An MCP Server for Ollama
This project lets developers integrate their local Ollama-powered large language models (LLMs) with AI assistant clients such as Claude Desktop or Cline. It exposes your local Ollama setup, including its installed models and their capabilities, as 'tools' that these assistants can invoke, so developers can use their self-hosted LLMs directly within compatible applications and keep AI capabilities local.
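Like other MCP servers, a client such as Claude Desktop typically registers the server in its configuration file (`claude_desktop_config.json`) under the standard `mcpServers` key. The exact launch command depends on how ollama-mcp is installed; the `npx` invocation and package name below are assumptions for illustration, not the project's documented setup:

```json
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["-y", "ollama-mcp"]
    }
  }
}
```

Once registered, the client starts the server as a subprocess and the local Ollama models appear as callable tools in the assistant.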
About claude-code-mcp
KunihiroS/claude-code-mcp
An MCP server that connects to the local Claude Code command.
This project helps developers integrate Claude Code directly into other applications that use the Model Context Protocol (MCP). It acts as a bridge: it accepts JSON requests containing code or commands, routes them to your local Claude Code installation, and returns the AI's response, such as a code explanation, review, fix, or test suggestion. A developer building or using an MCP host application would use this to leverage Claude Code's capabilities within their existing tools.
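The request/response flow described above follows MCP's JSON-RPC 2.0 framing. Below is a minimal sketch of the `tools/call` message an MCP host would send to a server like claude-code-mcp; the tool name `explain_code` and its arguments are hypothetical, since the actual tool names are whatever the server advertises in its `tools/list` response.

```python
import json

# Hypothetical tools/call request an MCP host might send to the server.
# The envelope (jsonrpc, id, method, params.name, params.arguments) is
# the standard MCP tool-invocation shape; "explain_code" is an assumed
# tool name used here only for illustration.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "explain_code",
        "arguments": {"code": "def add(a, b):\n    return a + b"},
    },
}

# MCP transports exchange these messages as serialized JSON.
wire_message = json.dumps(request)
print(wire_message)
```

The server's reply reuses the same `id` and carries the model's output (e.g., the explanation text) in its `result`, which the host application then surfaces to the user.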