ExplainableML/BayesCap
(ECCV 2022) BayesCap: Bayesian Identity Cap for Calibrated Uncertainty in Frozen Neural Networks
BayesCap helps scientists, engineers, and researchers working with existing computer vision models. Given an already-trained (frozen) model and a small dataset, it produces calibrated uncertainty estimates for that model's predictions, letting practitioners gauge the reliability of its outputs without expensive retraining.
No commits in the last 6 months.
Use this if you need to understand the confidence or reliability of predictions from your deployed computer vision models (e.g., for image super-resolution, deblurring, inpainting, medical image translation, or autonomous driving depth estimation) without altering or retraining the original model.
Not ideal if you are developing new deep learning models from scratch and want to build uncertainty directly into the initial training process, rather than adding it to a frozen model.
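The core idea from the paper: a small "cap" network sits on top of the frozen model's output and predicts the parameters (mu, alpha, beta) of a generalized Gaussian per pixel; minimizing its negative log-likelihood on a small dataset yields calibrated per-pixel uncertainty, with the original model untouched. The sketch below shows only that likelihood math (it is not the repo's actual API; function names here are illustrative):

```python
from math import lgamma, log, exp

def gen_gaussian_nll(y, mu, alpha, beta):
    """Per-element negative log-likelihood of a generalized Gaussian
    p(y) ∝ (beta / alpha) * exp(-(|y - mu| / alpha)^beta),
    with the constant log(2) term dropped. This is the loss the cap
    network would minimize while the base model stays frozen."""
    return ((abs(y - mu) / alpha) ** beta
            + log(alpha) + lgamma(1.0 / beta) - log(beta))

def gen_gaussian_var(alpha, beta):
    """Variance alpha^2 * Gamma(3/beta) / Gamma(1/beta): the quantity
    reported as the per-pixel uncertainty estimate."""
    return alpha ** 2 * exp(lgamma(3.0 / beta) - lgamma(1.0 / beta))

# Sanity checks: with beta = 2 the generalized Gaussian reduces to a
# Gaussian with variance alpha^2 / 2, and the NLL is smallest when the
# predicted mean mu matches the target y.
print(gen_gaussian_var(1.0, 2.0))                     # 0.5
print(gen_gaussian_nll(0.0, 0.0, 1.0, 2.0)
      < gen_gaussian_nll(1.0, 0.0, 1.0, 2.0))        # True
```

In the repo, the (mu, alpha, beta) maps are predicted by a small convolutional autoencoder trained on the frozen model's outputs; the formulas above are what that training objective reduces to per pixel.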
Stars
50
Forks
4
Language
Jupyter Notebook
License
Apache-2.0
Last pushed
Dec 14, 2022
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/ExplainableML/BayesCap"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
EmuKit/emukit
A Python-based toolbox of various methods in decision making, uncertainty quantification and...
google/uncertainty-baselines
High-quality implementations of standard and SOTA methods on a variety of tasks.
nielstron/quantulum3
Library for unit extraction - fork of quantulum for python3
IBM/UQ360
Uncertainty Quantification 360 (UQ360) is an extensible open-source toolkit that can help you...
aamini/evidential-deep-learning
Learn fast, scalable, and calibrated measures of uncertainty using neural networks!