dice-group/gerbil
GERBIL - General Entity annotatoR Benchmark
GERBIL helps evaluate the performance of different entity annotation and disambiguation tools. Given text and a set of annotation tools, it produces a detailed benchmark comparing how well each tool identifies and links entities within that text. This is useful for researchers and developers working on natural language processing applications.
Use this if you need to rigorously compare various entity recognition and linking systems or question answering systems.
Not ideal if you are an end-user simply looking to apply an entity annotation tool, rather than benchmark one.
Stars
230
Forks
57
Language
Java
License
AGPL-3.0
Category
nlp
Last pushed
Jan 29, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/dice-group/gerbil"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
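The curl call above can also be scripted. A minimal Python sketch: the URL path (`quality/nlp/dice-group/gerbil`) comes from this page, but the response schema is not documented here, so the `sample` body and its field names (`stars`, `forks`, etc.) are assumptions based on the stats listed above.

```python
import json
from urllib.parse import quote

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, repo: str) -> str:
    # Build the per-repo endpoint URL shown on this page,
    # e.g. category "nlp", repo "dice-group/gerbil".
    return f"{BASE}/{quote(category)}/{quote(repo, safe='/')}"

# Hypothetical response body, mirroring the fields listed above;
# the real schema may differ.
sample = '{"stars": 230, "forks": 57, "language": "Java", "license": "AGPL-3.0"}'

def summarize(body: str) -> str:
    # Parse the JSON response and format a one-line summary.
    data = json.loads(body)
    return f"{data['language']} project: {data['stars']} stars, {data['forks']} forks"

print(quality_url("nlp", "dice-group/gerbil"))
print(summarize(sample))
```

To actually fetch the data, pass the built URL to `urllib.request.urlopen` or `curl` as above; remember the unauthenticated limit of 100 requests/day.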
Related tools
MantisAI/nervaluate
Full named-entity (i.e., not tag/token) evaluation metrics based on SemEval’13
bltlab/seqscore
SeqScore: Scoring for named entity recognition and other sequence labeling tasks
syuoni/eznlp
Easy Natural Language Processing
LHNCBC/metamaplite
A near real-time named-entity recognizer
OpenJarbas/simple_NER
simple rule based named entity recognition