RAGLight and rag-backend-core

                    RAGLight            rag-backend-core
Overall score       68 (Established)    25 (Experimental)
Maintenance         20/25               10/25
Adoption            10/25               2/25
Maturity            16/25               13/25
Community           22/25               0/25
Stars               655                 2
Forks               99                  n/a
Downloads           n/a                 n/a
Commits (30d)       33                  0
Language            Python              Python
License             MIT                 MIT
Package             none published      none published
Dependents          none                none

About RAGLight

Bessouat40/RAGLight

RAGLight is a modular framework for Retrieval-Augmented Generation (RAG). It makes it easy to plug in different LLMs, embeddings, and vector stores, and now includes seamless MCP integration to connect external tools and data sources.

RAGLight helps you quickly build a chatbot that answers questions using your own documents, such as PDFs, Word files, or code. You feed it a collection of files, and it produces a chat interface where you can ask questions and get answers grounded in your specific information. This makes it well suited to anyone who needs a custom AI assistant that understands their own knowledge base.
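To make the workflow concrete, here is a minimal sketch of the RAG loop described above: index document chunks, retrieve the most relevant ones for a question, then assemble them into a grounded prompt for an LLM. This is an illustrative toy, not RAGLight's actual API; real embeddings are replaced with a bag-of-words cosine score, and the LLM call is left as a stub.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

class ToyVectorStore:
    """In-memory stand-in for a real vector store (Chroma, Qdrant, ...)."""
    def __init__(self):
        self.chunks: list[tuple[str, Counter]] = []

    def add(self, chunk: str) -> None:
        self.chunks.append((chunk, embed(chunk)))

    def top_k(self, query: str, k: int = 2) -> list[str]:
        q = embed(query)
        ranked = sorted(self.chunks, key=lambda c: cosine(q, c[1]), reverse=True)
        return [chunk for chunk, _ in ranked[:k]]

def build_prompt(question: str, store: ToyVectorStore) -> str:
    # A real pipeline would send this prompt to an LLM (Ollama, OpenAI, ...).
    context = "\n".join(store.top_k(question))
    return f"Context:\n{context}\n\nQ: {question}"

store = ToyVectorStore()
store.add("RAGLight supports pluggable vector stores.")
store.add("The MIT license permits commercial use.")
prompt = build_prompt("Which license is used?", store)
```

Frameworks like RAGLight wrap exactly these moving parts (chunking, embedding, retrieval, prompting) behind configurable components, so each piece can be swapped independently.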

Tags: knowledge-management, custom-chatbot, document-intelligence, information-retrieval, AI-assistant-creation

About rag-backend-core

jonnamartiinUdemm/rag-backend-core

rag-backend-core is a modular RAG backend API built with FastAPI. It features hybrid LLM support (Gemini/Ollama), Qdrant vector search, Redis caching, and async Celery processing.
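The Redis layer in a stack like this typically follows a cache-aside pattern: check the cache for a previously computed answer before running the expensive retrieval and LLM step. The sketch below illustrates that pattern only; it is not the project's actual code, an in-memory dict stands in for Redis, and a plain coroutine stands in for the Celery-dispatched pipeline.

```python
import asyncio

cache: dict[str, str] = {}  # stand-in for Redis

async def run_rag_pipeline(query: str) -> str:
    # Placeholder for the real work: Qdrant vector search plus a
    # Gemini or Ollama completion, dispatched via Celery.
    await asyncio.sleep(0)  # simulate async I/O
    return f"answer for: {query}"

async def handle_query(query: str) -> str:
    if query in cache:           # cache hit: skip the pipeline entirely
        return cache[query]
    result = await run_rag_pipeline(query)
    cache[query] = result        # populate the cache for next time
    return result

first = asyncio.run(handle_query("what is RAG?"))
second = asyncio.run(handle_query("what is RAG?"))  # served from cache
```

Caching at the query level works well for RAG backends because identical questions are common and the pipeline (embedding, vector search, generation) dominates latency.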

Scores are updated daily from GitHub, PyPI, and npm data.