rageval and RAG-Evaluator
Scores (out of 25)

                 rageval      RAG-Evaluator
  Maintenance    0/25         0/25
  Adoption       10/25        3/25
  Maturity       16/25        16/25
  Community      10/25        14/25

Stats

                 rageval      RAG-Evaluator
  Stars          170          4
  Forks          10           3
  Downloads      —            —
  Commits (30d)  0            0
  Language       Python       Python
  License        Apache-2.0   MIT

Flags (both projects): Stale 6m · No Package · No Dependents
About rageval
gomate-community/rageval
Evaluation tools for Retrieval-augmented Generation (RAG) methods.
This tool evaluates the performance of Retrieval-Augmented Generation (RAG) systems. It takes the outputs from each stage of a RAG pipeline (rewritten queries, retrieved documents, generated answers) and scores them on aspects such as answer correctness, factual consistency, and document relevance. It is aimed at AI/ML engineers and researchers building and refining RAG-based applications.
AI-evaluation
NLP-benchmarking
Generative-AI-testing
LLM-performance
Information-retrieval-quality
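To make "answer correctness" concrete: metrics of this kind typically compare a generated answer against a gold answer. The sketch below is not rageval's API (its actual interface is not shown on this page); it is a minimal, self-contained illustration of a token-overlap F1 score, one common way such evaluators quantify answer correctness.

```python
# Conceptual sketch only: a token-level F1 "answer correctness" metric,
# the kind of score a RAG evaluator computes. NOT rageval's actual API.
from collections import Counter

def answer_f1(prediction: str, reference: str) -> float:
    """Token-overlap F1 between a generated answer and a gold answer."""
    pred_tokens = prediction.lower().split()
    ref_tokens = reference.lower().split()
    if not pred_tokens or not ref_tokens:
        return 0.0
    # Counter intersection counts each shared token up to its min frequency.
    overlap = sum((Counter(pred_tokens) & Counter(ref_tokens)).values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)

# Word order doesn't matter for token overlap, so this scores 1.0:
print(answer_f1("Paris is the capital of France",
                "The capital of France is Paris"))
```

Real evaluators layer more robust variants on top of this idea (normalization, semantic similarity, LLM-as-judge), but the precision/recall framing is the same.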
About RAG-Evaluator
GURPREETKAURJETHRA/RAG-Evaluator
A library for evaluating Retrieval-Augmented Generation (RAG) systems
Scores updated daily from GitHub, PyPI, and npm data.