zjunlp/EasyDetect
[ACL 2024] An Easy-to-use Hallucination Detection Framework for LLMs.
When working with AI models that generate both text and images, EasyDetect helps you determine whether the model is fabricating information or presenting facts incorrectly. Given the model's visual input or output and its generated text, it reports whether each claim is accurate or hallucinated. The tool is aimed at AI researchers and developers building or evaluating multimodal AI systems.
No commits in the last 6 months.
Use this if you need to systematically check whether your Multimodal Large Language Models (MLLMs) generate false or inconsistent information in their image descriptions or image generations.
Not ideal if you want a general-purpose fact-checker for text-only models, or if you are not directly involved in developing or evaluating MLLMs.
Stars: 40
Forks: 2
Language: Python
License: Apache-2.0
Category: generative-ai
Last pushed: Feb 25, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/generative-ai/zjunlp/EasyDetect"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
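The same request can be made from Python. The sketch below only builds the endpoint URL shown in the curl example above; the `quality_url` helper and the `category` parameter value are assumptions for illustration, not part of a documented client library, and the response's JSON schema is not specified here, so fetching and parsing are left to the caller (e.g. via `urllib.request` or `requests`).

```python
from urllib.parse import quote

# Base path taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Hypothetical helper: build the quality-API URL for one repository.

    Path segments are percent-encoded defensively; the real API's
    category names (e.g. "generative-ai") are inferred from the example URL.
    """
    return f"{BASE}/{quote(category)}/{quote(owner)}/{quote(repo)}"

url = quality_url("generative-ai", "zjunlp", "EasyDetect")
print(url)
# → https://pt-edge.onrender.com/api/v1/quality/generative-ai/zjunlp/EasyDetect
```

From there, a plain GET on the URL (with `urllib.request.urlopen` or `requests.get`) returns the repository's quality data; add your API key once you exceed the 100 requests/day anonymous limit.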