neomatrix369/AIE7-Demo-Day-Project

RagCheck is a proactive corpus quality assessment tool that analyzes a RAG application's document collection before deployment, identifying content gaps and providing specific recommendations to improve query performance. The platform turns reactive corpus fixes into proactive quality assurance, helping organizations achieve quality scores as high as 85%.

Quality score: 18 / 100 (Experimental)

This tool helps organizations building Retrieval Augmented Generation (RAG) applications ensure their document collections are high quality before deployment. You provide your documents and test queries, and it analyzes them to identify content gaps and suggests specific improvements. This is for AI product managers, RAG system developers, or content strategists who want to proactively optimize their RAG application's performance.

Use this if you need to assess and improve the quality of your RAG application's source documents to ensure accurate and relevant responses.

Not ideal if you are looking for a tool to evaluate a deployed RAG system's live performance rather than its underlying document corpus.

Tags: RAG application development, AI content quality, Information retrieval, Knowledge base optimization, LLM application readiness
No License · No Package · No Dependents
Maintenance 6 / 25
Adoption 5 / 25
Maturity 7 / 25
Community 0 / 25
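The overall 18 / 100 score matches the sum of the four 25-point categories listed above; a minimal sketch of that arithmetic, assuming a simple additive model (the scoring formula itself is not documented here):

```python
# Category scores as shown in the breakdown above; the additive model is an assumption.
scores = {"Maintenance": 6, "Adoption": 5, "Maturity": 7, "Community": 0}

total = sum(scores.values())
print(f"{total} / {25 * len(scores)}")  # prints "18 / 100"
```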


Stars: 11
Forks:
Language: TypeScript
License: none
Last pushed: Oct 27, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/rag/neomatrix369/AIE7-Demo-Day-Project"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
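The endpoint appears to follow a `/quality/rag/{owner}/{repo}` path pattern; that pattern, and the `quality_url` helper below, are assumptions inferred from the single curl example above. A small Python sketch that builds the URL for any repository:

```python
# Base endpoint taken from the curl example; the {owner}/{repo} path
# pattern is an assumption based on that single example.
BASE = "https://pt-edge.onrender.com/api/v1/quality/rag"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-data URL for a repository (hypothetical helper)."""
    return f"{BASE}/{owner}/{repo}"

print(quality_url("neomatrix369", "AIE7-Demo-Day-Project"))
```

Fetching the resulting URL with any HTTP client should return the same data as the curl command above.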