aurelio-amerio/ConcreteDropout

Concrete Dropout implementation for TensorFlow 2.0 and PyTorch

Score: 45 / 100 (Emerging)

When building deep learning models, it is often hard to know how confident a model is in its predictions. This project provides specialized neural network layers that automatically learn the optimal amount of dropout, a regularization technique that prevents overfitting. It wraps your existing TensorFlow or PyTorch layers in enhanced versions that can better quantify prediction uncertainty. It is aimed at machine learning engineers and researchers who need to build more robust and interpretable models.
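The library's exact layer API is not reproduced here, but the underlying technique, Concrete Dropout, replaces the discrete Bernoulli dropout mask with a differentiable "Concrete" relaxation, so the dropout probability p can itself be learned by gradient descent instead of tuned by hand. A minimal NumPy sketch of the relaxed mask (function name and parameters are illustrative, not the library's API):

```python
import numpy as np

def concrete_dropout_mask(p, shape, temperature=0.1, eps=1e-7, rng=None):
    """Relaxed (continuous) dropout mask via the Concrete distribution.

    p is the dropout probability; because the mask is a smooth function
    of p, gradients can flow through it and p can be optimized directly.
    Illustrative sketch only, not the project's actual layer API.
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(size=shape)  # uniform noise drives the relaxation
    # Concrete relaxation: a smooth stand-in for a Bernoulli(p) drop decision
    logit = (np.log(p + eps) - np.log(1.0 - p + eps)
             + np.log(u + eps) - np.log(1.0 - u + eps))
    drop_prob = 1.0 / (1.0 + np.exp(-logit / temperature))  # sigmoid
    mask = 1.0 - drop_prob
    return mask / (1.0 - p)  # rescale so activations keep their expected value

# Apply a relaxed dropout mask to some activations
x = np.ones((4, 8))
y = x * concrete_dropout_mask(p=0.2, shape=x.shape)
```

In the library this mask would be produced inside a layer wrapper, together with a regularization term that penalizes p; the sketch above only illustrates the relaxation itself.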

No commits in the last 6 months. Available on PyPI.

Use this if you are developing neural networks and need to improve model robustness, prevent overfitting, and understand the uncertainty of your model's predictions without extensive manual tuning.

Not ideal if you are a beginner looking for a simple plug-and-play solution without understanding deep learning concepts like dropout and regularization.

deep-learning-model-building model-uncertainty-quantification neural-network-regularization overfitting-prevention bayesian-deep-learning
Stale (6 months) · No dependents
Maintenance: 0 / 25
Adoption: 5 / 25
Maturity: 25 / 25
Community: 15 / 25


Stars: 14
Forks: 4
Language: Python
License: MIT
Last pushed: Dec 18, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/aurelio-amerio/ConcreteDropout"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
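The same endpoint can be called from Python. A small sketch that builds the request URL from the path segments shown in the curl example (the helper name and parameters are hypothetical, not part of an official client):

```python
from urllib.parse import quote

def quality_api_url(collection, owner, repo,
                    base="https://pt-edge.onrender.com/api/v1/quality"):
    """Build the quality-score API URL for a repository.

    Mirrors the path layout of the curl example above; this helper is
    illustrative and not provided by the service itself.
    """
    return f"{base}/{quote(collection)}/{quote(owner)}/{quote(repo)}"

url = quality_api_url("ml-frameworks", "aurelio-amerio", "ConcreteDropout")
# → "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/aurelio-amerio/ConcreteDropout"
```

From there, any HTTP client (e.g. `urllib.request` or `requests`) can fetch the JSON; no key is needed within the free 100 requests/day limit.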