OpenKG-ORG/EasyDetect
An Easy-to-use Hallucination Detection Framework for LLMs.
EasyDetect checks whether outputs from multimodal large language models (such as GPT-4V or Gemini) are grounded in their inputs, i.e., that they don't "make things up." Given an image and a text description, or a text prompt and a generated image, it flags inconsistencies between the two. It is aimed at researchers, developers, and quality assurance teams who need to verify the reliability of AI-generated multimodal content.
No commits in the last 6 months.
Use this if you need to systematically identify and categorize hallucinations, i.e., factual errors or inconsistencies, in text that AI models generate from images or in images they generate from text.
Not ideal if you are only working with text-based AI models or if you need to detect issues beyond factual hallucinations, such as bias or toxicity.
Stars: 63
Forks: 4
Language: Python
License: Apache-2.0
Category:
Last pushed: Apr 21, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/generative-ai/OpenKG-ORG/EasyDetect"
Open to everyone: 100 requests/day with no key needed. Get a free key to raise the limit to 1,000/day.
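The curl command above can also be scripted. A minimal Python sketch using only the standard library; the response is assumed to be JSON, since its schema isn't documented here:

```python
import json
import urllib.request

# Base endpoint from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/generative-ai"


def build_url(owner: str, repo: str) -> str:
    """Construct the per-repository API endpoint URL."""
    return f"{API_BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch the quality record for a repository.

    Raises urllib.error.HTTPError on failures, e.g. when the
    anonymous 100 requests/day limit has been exceeded.
    """
    with urllib.request.urlopen(build_url(owner, repo), timeout=10) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Fetch the record for this repository and pretty-print it.
    data = fetch_quality("OpenKG-ORG", "EasyDetect")
    print(json.dumps(data, indent=2))
```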