metawake/ragtune

EXPLAIN ANALYZE for RAG retrieval — inspect, debug, benchmark, and tune your retrieval layer

Score: 35 / 100 · Emerging

This tool helps AI engineers and machine learning practitioners verify that their RAG (Retrieval-Augmented Generation) systems find the relevant information. You feed it your documents and a set of test questions, and it reports how well your system retrieves the correct passages. It helps you pinpoint issues and compare different retrieval settings, ultimately improving the quality of your AI application's responses.

Use this if you are building or maintaining a RAG system and need to systematically debug, benchmark, and monitor its retrieval performance using your own data and questions.

Not ideal if you need to evaluate the end-to-end quality of an LLM's generated answers, including aspects like fluency, coherence, or factuality beyond just retrieval.

AI-engineering LLM-development information-retrieval MLOps search-quality
No package · No dependents
Maintenance: 10 / 25
Adoption: 5 / 25
Maturity: 13 / 25
Community: 7 / 25


Stars: 10
Forks: 1
Language: Go
License: MIT
Last pushed: Feb 25, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/rag/metawake/ragtune"

Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000/day.
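Assuming the endpoint returns JSON (the response schema is not documented on this page), the output can be pretty-printed or inspected with jq; the pipe below is an illustrative sketch:

```shell
# Fetch quality data for metawake/ragtune; -s suppresses curl's progress output.
# jq '.' pretty-prints whatever JSON comes back (no field names are assumed).
curl -s "https://pt-edge.onrender.com/api/v1/quality/rag/metawake/ragtune" | jq '.'
```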