mcp-rag-server and mcp-ragchat

                 mcp-rag-server                        mcp-ragchat
Overall score    38 (Emerging)                         38 (Emerging)
Maintenance      2/25                                  10/25
Adoption         7/25                                  4/25
Maturity         16/25                                 12/25
Community        13/25                                 12/25
Stars            25                                    1
Forks            4                                     1
Downloads                                              14
Commits (30d)    0                                     0
Language         TypeScript                            TypeScript
License          MIT                                   (none)
Flags            Stale 6m, No Package, No Dependents   No License

About mcp-rag-server

kwanLeeFrmVi/mcp-rag-server

mcp-rag-server is a Model Context Protocol (MCP) server that provides Retrieval-Augmented Generation (RAG) capabilities, letting Large Language Models (LLMs) answer questions from your document content by indexing and retrieving relevant information efficiently.

It is aimed at developers who integrate LLMs into applications. It takes a collection of documents, such as text files or Markdown, and turns them into a searchable index; that index lets the LLM give more accurate, context-aware answers grounded in your specific content rather than only in its general training data.

LLM-integration developer-tool information-retrieval contextual-AI AI-application-development
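The index-then-retrieve flow described above can be sketched in a few lines. This is a minimal illustration, not the actual mcp-rag-server API: the names (`DocChunk`, `buildIndex`, `retrieve`) are hypothetical, and it uses simple term matching where a real server would use embeddings.

```typescript
// Hypothetical sketch of the RAG indexing/retrieval flow; names and
// scoring are illustrative, not mcp-rag-server's real implementation.

interface DocChunk {
  source: string;
  text: string;
}

// Build an inverted index: term -> chunks containing that term.
function buildIndex(chunks: DocChunk[]): Map<string, DocChunk[]> {
  const index = new Map<string, DocChunk[]>();
  for (const chunk of chunks) {
    for (const term of new Set(chunk.text.toLowerCase().split(/\W+/))) {
      if (!term) continue;
      const bucket = index.get(term) ?? [];
      bucket.push(chunk);
      index.set(term, bucket);
    }
  }
  return index;
}

// Retrieve chunks ranked by how many query terms they contain.
function retrieve(index: Map<string, DocChunk[]>, query: string, k = 2): DocChunk[] {
  const scores = new Map<DocChunk, number>();
  for (const term of query.toLowerCase().split(/\W+/)) {
    for (const chunk of index.get(term) ?? []) {
      scores.set(chunk, (scores.get(chunk) ?? 0) + 1);
    }
  }
  return [...scores.entries()]
    .sort((a, b) => b[1] - a[1])
    .slice(0, k)
    .map(([chunk]) => chunk);
}

const chunks: DocChunk[] = [
  { source: "setup.md", text: "Install the server with npm install" },
  { source: "usage.md", text: "Index documents before asking questions" },
];
const index = buildIndex(chunks);
const hits = retrieve(index, "how do I install the server?");
// The retrieved text would be prepended to the LLM prompt as context.
console.log(hits[0].source); // → "setup.md"
```

In an MCP setup, the retrieved chunks would be returned to the client and injected into the LLM's prompt as grounding context.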

About mcp-ragchat

gogabrielordonez/mcp-ragchat

An MCP server that adds RAG-powered AI chat to any website, set up with a single command from Claude Code. It uses a local vector store and supports multiple LLM providers (OpenAI, Anthropic, Gemini), with zero cloud dependency.
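A "local vector store with zero cloud dependency" can be as small as an in-memory list searched by cosine similarity. The sketch below is an assumption about the general technique, not mcp-ragchat's code: it uses bag-of-words term counts as vectors, where the real project would call a provider's embedding model.

```typescript
// Minimal local vector store sketch: cosine similarity over sparse
// bag-of-words vectors. Illustrative only; not mcp-ragchat's API.

function termCounts(text: string): Map<string, number> {
  const counts = new Map<string, number>();
  for (const term of text.toLowerCase().split(/\W+/)) {
    if (term) counts.set(term, (counts.get(term) ?? 0) + 1);
  }
  return counts;
}

function cosine(a: Map<string, number>, b: Map<string, number>): number {
  let dot = 0, na = 0, nb = 0;
  for (const [term, x] of a) {
    dot += x * (b.get(term) ?? 0);
    na += x * x;
  }
  for (const x of b.values()) nb += x * x;
  return na && nb ? dot / Math.sqrt(na * nb) : 0;
}

class LocalVectorStore {
  private entries: { text: string; vector: Map<string, number> }[] = [];

  add(text: string): void {
    this.entries.push({ text, vector: termCounts(text) });
  }

  // Return the k stored texts most similar to the query.
  query(text: string, k = 1): string[] {
    const qv = termCounts(text);
    return [...this.entries]
      .sort((a, b) => cosine(qv, b.vector) - cosine(qv, a.vector))
      .slice(0, k)
      .map((e) => e.text);
  }
}

const store = new LocalVectorStore();
store.add("shipping policy: orders ship within two days");
store.add("returns: refunds are issued within a week");
const best = store.query("when does my order ship?")[0];
console.log(best); // → "shipping policy: orders ship within two days"
```

Because everything lives in process memory, a store like this needs no external database or cloud service, which is the design trade-off the description highlights.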

Scores updated daily from GitHub, PyPI, and npm data.