aamini/evidential-deep-learning

Learn fast, scalable, and calibrated measures of uncertainty using neural networks!

Quality score: 59 / 100 (Established)

This project helps machine learning engineers build more reliable AI models. It takes a standard neural network and modifies its final layers to output not just a prediction, but also a quantifiable measure of confidence in that prediction. This allows developers to understand when their models are uncertain, even on new or unusual data, making AI systems safer and more trustworthy.
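For evidential regression in the style this repository implements (Amini et al.), the modified output head predicts four Normal-Inverse-Gamma parameters (γ, ν, α, β) instead of a single point estimate, and the prediction plus both uncertainty components follow in closed form. A minimal sketch of those standard formulas, assuming the usual NIG parameterization (this is illustrative, not code from the repository):

```python
from dataclasses import dataclass

@dataclass
class NIGOutput:
    """Normal-Inverse-Gamma parameters emitted by an evidential regression head."""
    gamma: float  # predicted mean
    nu: float     # virtual evidence supporting the mean (nu > 0)
    alpha: float  # shape of the variance prior (alpha > 1)
    beta: float   # scale of the variance prior (beta > 0)

    def prediction(self) -> float:
        # The point prediction is just the NIG mean parameter.
        return self.gamma

    def aleatoric(self) -> float:
        # E[sigma^2] = beta / (alpha - 1): noise inherent in the data,
        # which more training evidence cannot reduce.
        return self.beta / (self.alpha - 1)

    def epistemic(self) -> float:
        # Var[mu] = beta / (nu * (alpha - 1)): the model's own uncertainty,
        # which shrinks as the evidence parameter nu grows.
        return self.beta / (self.nu * (self.alpha - 1))


out = NIGOutput(gamma=2.0, nu=5.0, alpha=3.0, beta=4.0)
print(out.prediction(), out.aleatoric(), out.epistemic())
```

High epistemic uncertainty on an input is the signal that the model is extrapolating beyond its training data, which is how these networks flag "new or unusual" inputs.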

513 stars. No commits in the last 6 months. Available on PyPI.

Use this if you are developing AI models where understanding the model's confidence in its predictions is critical for safety or decision-making.

Not ideal if you are a business user or practitioner simply looking for a ready-to-use AI solution, as this requires direct modification of neural network architectures.

Tags: machine-learning-engineering, model-uncertainty, AI-safety, neural-network-development, predictive-modeling

Flags: Stale (6 months), No Dependents

Score breakdown:
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 25 / 25
Community: 24 / 25


Stars: 513
Forks: 101
Language: Python
License: Apache-2.0
Last pushed: Aug 31, 2021
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/aamini/evidential-deep-learning"

Open to everyone: 100 requests/day with no key needed, or get a free key for 1,000 requests/day.