Kaleidophon/nlp-uncertainty-zoo
Model zoo for different kinds of uncertainty quantification methods used in Natural Language Processing, implemented in PyTorch.
This project offers a collection of pre-built models designed to quantify how confident a Natural Language Processing (NLP) model is in its predictions. It takes raw text as input and outputs both the model's prediction (such as a classification or a generated sequence) and a score indicating the uncertainty of that prediction. Researchers and engineers building advanced NLP applications can use it to make their systems more reliable and transparent.
No commits in the last 6 months. Available on PyPI.
Use this if you need to understand or improve the reliability of your NLP model's predictions by quantifying the uncertainty associated with them.
Not ideal if you are looking for a simple, off-the-shelf NLP solution that doesn't require deep understanding or customization of uncertainty quantification methods.
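To make the idea concrete, here is a minimal, library-independent sketch of one common uncertainty score that model zoos of this kind expose: predictive entropy over several stochastic forward passes (as in MC Dropout or ensembling). The logits and the number of passes below are made up for illustration; this is not the nlp-uncertainty-zoo API itself.

```python
import math
import random

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predictive_entropy(prob_samples):
    """Entropy of the mean predicted distribution across stochastic
    forward passes. Higher values mean the model is less certain."""
    n = len(prob_samples)
    k = len(prob_samples[0])
    mean = [sum(sample[c] for sample in prob_samples) / n for c in range(k)]
    return -sum(p * math.log(p) for p in mean if p > 0)

# Simulate T=10 stochastic forward passes (e.g. dropout kept on at
# inference) for one input over 3 hypothetical classes.
random.seed(0)
base_logits = [2.0, 0.5, -1.0]  # invented logits, for illustration only
samples = [
    softmax([z + random.gauss(0, 0.3) for z in base_logits])
    for _ in range(10)
]
score = predictive_entropy(samples)
print(f"uncertainty (predictive entropy): {score:.3f}")
```

The score ranges from 0 (all passes agree on one class) to log(K) for K classes (maximum disagreement), which is why it is useful as a single reliability number alongside the prediction.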
Stars: 55
Forks: 4
Language: Python
License: —
Category: —
Last pushed: May 05, 2023
Commits (30d): 0
Dependencies: 16
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Kaleidophon/nlp-uncertainty-zoo"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
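The same request can be issued from Python. The endpoint path below is taken verbatim from the curl example above; the response schema is not documented here, so this sketch only builds the request URL (a hypothetical `quality_url` helper) rather than assuming any particular JSON shape.

```python
from urllib.parse import quote

# Base path copied from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner, repo):
    """Build the quality-API URL for a given GitHub owner/repo slug,
    percent-escaping each path segment."""
    return f"{BASE}/{quote(owner, safe='')}/{quote(repo, safe='')}"

url = quality_url("Kaleidophon", "nlp-uncertainty-zoo")
print(url)
# Fetch with any HTTP client, e.g.:
#   urllib.request.urlopen(url).read()
# (no API key needed up to 100 requests/day).
```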
Higher-rated alternatives
- transformerlab/transformerlab-app: The open source research environment for AI researchers to seamlessly train, evaluate, and scale...
- naru-project/naru: Neural Relation Understanding: neural cardinality estimators for tabular data
- neurocard/neurocard: State-of-the-art neural cardinality estimators for join queries
- danielzuegner/code-transformer: Implementation of the paper "Language-agnostic representation learning of source code from...
- salesforce/CodeTF: CodeTF: One-stop Transformer Library for State-of-the-art Code LLM