hutec/UncertaintyNN

Implementation and evaluation of different approaches to quantifying uncertainty in neural networks

Score: 38 / 100 (Emerging)

This project helps machine learning engineers and researchers assess the reliability of their neural network predictions. It takes a trained neural network and applies several uncertainty quantification techniques to determine not just what the network predicts, but also how confident it is in that prediction. The output provides insight into the model's certainty, which is crucial for high-stakes applications.
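For orientation, below is a minimal sketch of Monte Carlo dropout, one common uncertainty quantification technique in this space; it is not code from this repository. The PyTorch framework, the toy model, and the mc_dropout_predict helper are all illustrative assumptions. The idea: keep dropout active at inference and treat the spread of repeated stochastic forward passes as a rough uncertainty estimate.

import torch
import torch.nn as nn

# Toy regression model; the Dropout layer stays active during sampling.
model = nn.Sequential(
    nn.Linear(10, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(64, 1),
)

def mc_dropout_predict(model, x, n_samples=100):
    # model.train() keeps dropout stochastic at inference time.
    model.train()
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(n_samples)])
    # Mean of the samples is the prediction; std is a crude uncertainty estimate.
    return preds.mean(dim=0), preds.std(dim=0)

x = torch.randn(1, 10)
mean, std = mc_dropout_predict(model, x)
print(f"prediction: {mean.item():.3f} +/- {std.item():.3f}")

A large standard deviation flags inputs on which the model's point prediction should not be trusted on its own.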

142 stars. No commits in the last 6 months.

Use this if you are developing or deploying neural networks and need to understand the confidence or uncertainty associated with your model's outputs, especially in domains like medical imaging, autonomous driving, or financial forecasting.

Not ideal if you are looking for general-purpose neural network training tools rather than specific methods for quantifying model uncertainty.

machine-learning-engineering model-evaluation deep-learning-research predictive-modeling risk-assessment
No License · Stale 6m · No Package · No Dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 8 / 25
Community 20 / 25


Stars: 142
Forks: 29
Language: Jupyter Notebook
License: none
Last pushed: Feb 16, 2018
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/hutec/UncertaintyNN"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
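If you prefer Python over curl, a small sketch using the requests library hits the same endpoint; the shape of the JSON response is not documented here, so it simply prints the parsed body.

import requests

url = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/hutec/UncertaintyNN"
resp = requests.get(url, timeout=10)
resp.raise_for_status()  # fail loudly on HTTP errors or rate limiting
print(resp.json())       # assumes the endpoint returns a JSON body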