hutec/UncertaintyNN
Implementation and evaluation of different approaches to quantifying uncertainty in neural networks
This project helps machine learning engineers and researchers assess the reliability of their neural network predictions. It takes a trained neural network and applies different uncertainty quantification techniques to determine not just what the network predicts, but also how confident it is in that prediction. The output offers insight into the model's certainty, which is crucial for high-stakes applications.
142 stars. No commits in the last 6 months.
Use this if you are developing or deploying neural networks and need to understand the confidence or uncertainty associated with your model's outputs, especially in domains like medical imaging, autonomous driving, or financial forecasting.
Not ideal if you are looking for general-purpose neural network training tools rather than specific methods for quantifying model uncertainty.
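To make "how confident it is in that prediction" concrete, here is a minimal sketch of one widely used technique in this space: Monte Carlo dropout. Dropout is left active at inference time, and the spread across several stochastic forward passes serves as an uncertainty estimate. The tiny network and its random weights below are illustrative assumptions, not code from this repository.

```python
# Monte Carlo dropout sketch: keep dropout ON at inference and run T
# stochastic forward passes; the mean is the prediction, the standard
# deviation is an uncertainty estimate.
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer regression network with fixed random weights
# (hypothetical stand-in for a trained model).
W1 = rng.normal(size=(1, 32))
W2 = rng.normal(size=(32, 1))

def forward(x: np.ndarray, p_drop: float = 0.5) -> np.ndarray:
    """One stochastic forward pass with dropout left active."""
    h = np.tanh(x @ W1)
    mask = rng.random(h.shape) > p_drop   # Bernoulli dropout mask
    h = h * mask / (1.0 - p_drop)         # inverted-dropout scaling
    return h @ W2

x = np.array([[0.3]])
samples = np.stack([forward(x) for _ in range(100)])
mean, std = samples.mean(), samples.std()
print(f"prediction = {mean:.3f} +/- {std:.3f}")  # std = uncertainty
```

A larger spread across passes signals lower model confidence for that input, which is the kind of signal the high-stakes use cases above depend on.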
Stars: 142
Forks: 29
Language: Jupyter Notebook
License: —
Last pushed: Feb 16, 2018
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/hutec/UncertaintyNN"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
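Beyond curl, the endpoint can be called from a script. The URL pattern comes from the command above; the shape of the JSON response is an assumption, not documented here, so the sketch only builds and prints the request URL and leaves the fetch as a helper.

```python
# Sketch: querying the pt-edge quality API for a repository.
# URL pattern taken from the curl example; response schema is assumed.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks"

def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository endpoint URL."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """GET the endpoint and decode the JSON body (schema assumed)."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

print(quality_url("hutec", "UncertaintyNN"))
```

Anonymous calls are limited to 100 requests/day, so cache responses rather than refetching on every run.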
Higher-rated alternatives
EmuKit/emukit
A Python-based toolbox of various methods in decision making, uncertainty quantification and...
google/uncertainty-baselines
High-quality implementations of standard and SOTA methods on a variety of tasks.
nielstron/quantulum3
Library for unit extraction - fork of quantulum for python3
IBM/UQ360
Uncertainty Quantification 360 (UQ360) is an extensible open-source toolkit that can help you...
aamini/evidential-deep-learning
Learn fast, scalable, and calibrated measures of uncertainty using neural networks!