repo-graphrag-mcp and mcp-rag-server

Both tools are Model Context Protocol (MCP) servers designed for Retrieval Augmented Generation (RAG). That makes them **competitors** offering similar functionality for grounding LLM answers in contextual documents, though repo-graphrag-mcp specifically emphasizes building a knowledge graph from code and documentation.

| Metric | repo-graphrag-mcp | mcp-rag-server |
|---|---|---|
| Overall score | 42 (Emerging) | 38 (Emerging) |
| Maintenance | 10/25 | 2/25 |
| Adoption | 3/25 | 7/25 |
| Maturity | 15/25 | 16/25 |
| Community | 14/25 | 13/25 |
| Stars | 3 | 25 |
| Forks | 3 | 4 |
| Downloads | — | — |
| Commits (30d) | 0 | 0 |
| Language | Python | TypeScript |
| License | MIT | MIT |
| Flags | No package, no dependents | Stale (6 months), no package, no dependents |

About repo-graphrag-mcp

yumeiriowl/repo-graphrag-mcp

An MCP server that uses LightRAG and Tree-sitter to build a repository knowledge graph from code and documentation, supporting Q&A and implementation planning.
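The core graph-building idea can be sketched roughly: parse source files into syntax trees, then record definitions and references as nodes and edges. This illustration uses Python's stdlib `ast` module as a simple stand-in for Tree-sitter (which repo-graphrag-mcp actually uses); the node/edge schema here is purely hypothetical, not the project's real format.

```python
import ast

def build_code_graph(source: str, module: str = "example"):
    """Rough sketch: extract function definitions (nodes) and call
    relationships (edges) from one Python module. A stand-in for the
    Tree-sitter-based extraction a real tool would perform."""
    tree = ast.parse(source)
    nodes, edges = set(), set()
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            nodes.add(f"{module}.{node.name}")
            # Record which plain names this function calls.
            for child in ast.walk(node):
                if isinstance(child, ast.Call) and isinstance(child.func, ast.Name):
                    edges.add((f"{module}.{node.name}", child.func.id))
    return nodes, edges

src = """
def load(path):
    return open(path).read()

def summarize(path):
    text = load(path)
    return text[:100]
"""
nodes, edges = build_code_graph(src)
```

Once code and docs are linked in such a graph, questions like "what would changing `load` affect?" become graph traversals rather than plain text search, which is what enables the Q&A and planning use cases described above.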

About mcp-rag-server

kwanLeeFrmVi/mcp-rag-server

mcp-rag-server is a Model Context Protocol (MCP) server that enables Retrieval Augmented Generation (RAG) capabilities. It empowers Large Language Models (LLMs) to answer questions based on your document content by indexing and retrieving relevant information efficiently.

This is a tool for developers who integrate large language models (LLMs) into applications. It takes your collection of documents, like text files or markdown, and turns them into a searchable index. This index then helps your LLM provide more accurate and context-aware answers based on your specific content, rather than just its general training data.
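That index-then-retrieve flow can be illustrated with a minimal keyword-overlap sketch. This is not mcp-rag-server's actual implementation (a production RAG server would typically rank by semantic similarity), and every name below is made up, but the shape is the same: index documents, retrieve the most relevant ones for a query, and hand them to the LLM as context.

```python
from collections import Counter

def tokenize(text: str) -> Counter:
    # Bag-of-words representation of a text.
    return Counter(text.lower().split())

def build_index(docs: dict) -> dict:
    # Index step: one bag-of-words per document.
    return {name: tokenize(body) for name, body in docs.items()}

def retrieve(index: dict, query: str, k: int = 1) -> list:
    # Retrieval step: rank documents by token overlap with the query.
    q = tokenize(query)
    scored = sorted(
        index.items(),
        key=lambda item: sum((item[1] & q).values()),
        reverse=True,
    )
    return [name for name, _ in scored[:k]]

docs = {
    "setup.md": "install the server and configure the api key",
    "usage.md": "ask questions and the model answers from your documents",
}
index = build_index(docs)
top = retrieve(index, "how do I install and configure the server")
```

The retrieved document bodies would then be prepended to the LLM prompt, letting the model answer from your specific content rather than only its training data.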

Tags: LLM-integration, developer-tool, information-retrieval, contextual-AI, AI-application-development

Scores are updated daily from GitHub, PyPI, and npm data.