octanove/expats
EXPATS: A Toolkit for Explainable Automated Text Scoring
This toolkit helps educators, researchers, and content creators assess written text at scale. You feed it a collection of essays, articles, or other documents, and it scores them against criteria such as writing quality or readability, and can explain why each score was assigned. Typical users include a teacher grading essays or a publisher gauging content complexity.
No commits in the last 6 months.
Use this if you need to automate the scoring or readability assessment of large batches of written text and want to understand why a model assigned a particular score.
Not ideal if you only need to score a few pieces of text manually or are looking for a simple, off-the-shelf application without customization.
Stars: 23
Forks: 4
Language: Python
License: Apache-2.0
Category:
Last pushed: Apr 27, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/octanove/expats"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
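The curl command above can also be wrapped in a few lines of Python. This is a minimal sketch: the endpoint path and rate limits come from the listing, but the response schema and the authentication header name for a free key are assumptions, not documented API facts.

```python
import json
from urllib.request import urlopen, Request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{API_BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str, api_key=None) -> dict:
    """Fetch quality data for one repository.

    Passing an API key raises the daily limit from 100 to 1,000 requests;
    the Authorization header name here is a guess -- check the API docs.
    """
    req = Request(quality_url(category, owner, repo))
    if api_key:
        req.add_header("Authorization", f"Bearer {api_key}")
    with urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Anonymous request for the repository on this page.
    data = fetch_quality("nlp", "octanove", "expats")
    print(json.dumps(data, indent=2))
```

The network call is kept inside the `__main__` guard so the helper functions can be imported and reused without immediately hitting the rate-limited endpoint.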
Higher-rated alternatives
rmovva/HypotheSAEs
HypotheSAEs: hypothesizing interpretable relationships in text datasets using sparse...
interpretml/interpret-text
A library that incorporates state-of-the-art explainers for text-based machine learning models...
fdalvi/NeuroX
A Python library that encapsulates various methods for neuron interpretation and analysis in...
jalammar/ecco
Explain, analyze, and visualize NLP language models. Ecco creates interactive visualizations...
alexdyysp/ESIM-pytorch
China Collegiate Computing Contest: Big Data Challenge