AlaaLab/deep-learning-uncertainty
Literature survey, paper reviews, experimental setups, and a collection of implementations of baseline methods for predictive uncertainty estimation in deep learning models.
This resource collects research papers and practical code examples for understanding and implementing predictive uncertainty estimation in deep learning models. It helps data scientists and machine learning engineers assess how confident a deep learning model's predictions are: given a trained model, these methods output a quantification of its prediction uncertainty.
640 stars. No commits in the last 6 months.
Use this if you are a data scientist or machine learning engineer who needs to understand and implement techniques to measure the reliability and confidence of predictions from deep learning models.
Not ideal if you are a non-technical user looking for a ready-to-use tool without delving into the underlying statistical and machine learning concepts.
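To illustrate the kind of baseline method the repository covers, here is a minimal NumPy sketch of Monte Carlo dropout (Gal & Ghahramani, 2016): dropout is kept active at test time, and the spread of predictions across stochastic forward passes serves as an uncertainty estimate. The toy single-layer "model" and its weights are illustrative assumptions, not code from this repository.

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_dropout_predict(x, weights, n_samples=100, drop_p=0.5):
    """MC dropout on a toy linear model (illustrative only):
    sample dropout masks at test time and average the predictions.
    Returns (mean prediction, per-output standard deviation)."""
    preds = []
    for _ in range(n_samples):
        mask = rng.random(weights.shape) > drop_p          # random dropout mask
        preds.append(x @ (weights * mask) / (1.0 - drop_p))  # rescaled forward pass
    preds = np.stack(preds)
    return preds.mean(axis=0), preds.std(axis=0)

x = np.array([1.0, 2.0])
w = rng.normal(size=(2, 1))  # stand-in for trained weights
mean, std = mc_dropout_predict(x, w)
```

The standard deviation `std` is the uncertainty estimate; in a real setting the toy linear layer would be replaced by stochastic forward passes through a trained network with dropout layers enabled.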
Stars: 640
Forks: 78
Language: Jupyter Notebook
License: —
Last pushed: Aug 01, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/AlaaLab/deep-learning-uncertainty"
Open to everyone: 100 requests/day with no key; a free key raises the limit to 1,000/day.
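The same request can be made from Python with the standard library. This is a hedged sketch: the endpoint is taken from the curl command above, but the JSON schema of the response is not documented here, so the record is printed as-is rather than parsed into named fields.

```python
import json
import urllib.request

# Endpoint copied from the curl example; response schema is an assumption.
API_URL = ("https://pt-edge.onrender.com/api/v1/quality/"
           "ml-frameworks/AlaaLab/deep-learning-uncertainty")

def fetch_quality(url=API_URL, timeout=10):
    """Fetch the repo-quality record; raises urllib.error.URLError on failure."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(fetch_quality())  # anonymous tier: 100 requests/day
```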
Higher-rated alternatives
EmuKit/emukit
A Python-based toolbox of various methods in decision making, uncertainty quantification and...
google/uncertainty-baselines
High-quality implementations of standard and SOTA methods on a variety of tasks.
nielstron/quantulum3
Library for unit extraction - fork of quantulum for python3
IBM/UQ360
Uncertainty Quantification 360 (UQ360) is an extensible open-source toolkit that can help you...
aamini/evidential-deep-learning
Learn fast, scalable, and calibrated measures of uncertainty using neural networks!