clabrugere/evidential-deeplearning

Implementation of "Evidential Deep Learning to Quantify Classification Uncertainty", a paper proposing a method to quantify prediction uncertainty in neural networks.

Score: 36 / 100 (Emerging)

This project helps machine learning developers build neural networks that not only classify data but also express how certain they are about their predictions. Instead of returning a bare class label, the model provides a measure of 'evidence' for each class and an overall uncertainty score (see the sketch below). It is aimed at machine learning engineers and data scientists who build classification models and need to understand how reliable the model's outputs are.
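
As a rough illustration of the underlying idea from the paper (Sensoy et al., 2018), the sketch below converts raw network outputs into Dirichlet evidence, per-class belief masses, and a single uncertainty score. The function name and the PyTorch framework are assumptions for illustration, not the repository's actual API.

import torch
import torch.nn.functional as F

def dirichlet_uncertainty(logits):
    """Belief masses and uncertainty from raw outputs, following the
    evidential deep learning formulation (illustrative, not the repo's API)."""
    evidence = F.relu(logits)                    # non-negative evidence e_k per class
    alpha = evidence + 1.0                       # Dirichlet parameters alpha_k = e_k + 1
    strength = alpha.sum(dim=-1, keepdim=True)   # Dirichlet strength S = sum_k alpha_k
    belief = evidence / strength                 # belief mass b_k = e_k / S
    num_classes = logits.shape[-1]
    uncertainty = num_classes / strength         # u = K / S; sum_k b_k + u = 1
    probs = alpha / strength                     # expected class probabilities
    return belief, uncertainty, probs

# Example: weak total evidence yields high uncertainty.
logits = torch.tensor([[2.0, 0.5, -1.0]])
b, u, p = dirichlet_uncertainty(logits)          # here u = 3 / 5.5, about 0.55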

No commits in the last 6 months.

Use this if you are building a classification model and need to quantify the confidence or uncertainty in your predictions, especially in applications where incorrect classifications could have significant consequences.

Not ideal if you only need a standard classification output without any measure of prediction confidence or if you are not working with deep learning models.

Tags: machine-learning-engineering, deep-learning, model-uncertainty, classification-models, predictive-modeling
Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 7 / 25
Maturity: 16 / 25
Community: 13 / 25


Stars: 30
Forks: 5
Language: Python
License: MIT
Last pushed: Sep 14, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/clabrugere/evidential-deeplearning"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
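
To consume the same endpoint from Python, a minimal sketch using the requests library is below; it assumes the endpoint returns JSON, which this page does not confirm.

import requests

URL = ("https://pt-edge.onrender.com/api/v1/quality/"
       "ml-frameworks/clabrugere/evidential-deeplearning")

resp = requests.get(URL)   # no API key needed within the free 100 requests/day
resp.raise_for_status()
print(resp.json())         # assumed JSON payload containing the quality scores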