spcl/MRAG

Official Implementation of "Multi-Head RAG: Solving Multi-Aspect Problems with LLMs"

Score: 51 / 100 (Established)

This project helps developers working with large language models (LLMs) improve information retrieval for complex queries. Given a query that requires diverse information and a collection of documents, it retrieves more relevant documents by capturing the distinct facets of both the query and the documents. It is aimed at LLM developers, AI researchers, and data scientists building retrieval-augmented generation (RAG) systems.


Use this if your LLM application struggles to retrieve all necessary information when user queries touch on multiple, distinct topics within your document base.

Not ideal if you only need simple keyword search, or if your queries always target very specific, single-topic documents.

Topics: LLM development, information retrieval, natural language processing, AI research, RAG systems
No package, no dependents
Maintenance: 10 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 15 / 25


Stars: 240
Forks: 25
Language: Python
License:
Last pushed: Feb 26, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/rag/spcl/MRAG"

Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
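The same endpoint can be called from Python instead of curl. A minimal sketch using only the standard library, assuming the endpoint returns JSON (the response schema is not documented here, so the code simply returns the parsed payload; `quality_url` and `fetch_quality` are illustrative names, not part of the API):

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/rag"

def quality_url(owner: str, repo: str) -> str:
    # Build the endpoint URL for a given owner/repo pair.
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    # Fetch the quality report; without an API key this counts
    # against the 100 requests/day anonymous quota.
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)
```

For example, `fetch_quality("spcl", "MRAG")` requests the same URL as the curl command above.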