aurelio-amerio/ConcreteDropout
Concrete Dropout implementation for Tensorflow 2.0 and PyTorch
When building deep learning models, it is often hard to know how confident a model is in its predictions. This project provides neural network layers that automatically learn the optimal dropout rate (dropout is a regularization technique used to prevent overfitting). It wraps your existing TensorFlow or PyTorch layers, producing enhanced layers that can better quantify prediction uncertainty. It is aimed at machine learning engineers and researchers who need to build more robust and interpretable models.
No commits in the last 6 months. Available on PyPI.
Use this if you are developing neural networks and need to improve model robustness, prevent overfitting, and understand the uncertainty of your model's predictions without extensive manual tuning.
Not ideal if you are a beginner looking for a simple plug-and-play solution without understanding deep learning concepts like dropout and regularization.
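To give a sense of the core idea, the sketch below shows the continuous ("Concrete") relaxation of a Bernoulli dropout mask that makes the drop probability differentiable, which is what allows the dropout rate to be learned by gradient descent. This is a minimal NumPy illustration of the technique from the Concrete Dropout paper, not this library's actual API; the function name and parameters are hypothetical.

```python
import numpy as np

def concrete_dropout_mask(p, shape, temperature=0.1, eps=1e-7, seed=None):
    """Relaxed (Concrete) dropout keep-mask.

    p           -- drop probability (the quantity Concrete Dropout learns)
    temperature -- relaxation temperature; the mask approaches hard 0/1
                   values as temperature -> 0
    Illustrative sketch only; the real library wraps TF/PyTorch layers.
    """
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=shape)  # uniform noise, one draw per unit
    # Relaxed Bernoulli sample via the sigmoid of scaled logits + noise;
    # differentiable in p, unlike sampling a hard Bernoulli mask.
    drop = 1.0 / (1.0 + np.exp(-(
        np.log(p + eps) - np.log(1.0 - p + eps)
        + np.log(u + eps) - np.log(1.0 - u + eps)
    ) / temperature))
    return 1.0 - drop  # keep-mask: near 0 where a unit is dropped

# With a low temperature and p = 0.2, roughly 80% of units are kept.
mask = concrete_dropout_mask(p=0.2, shape=(10_000,), seed=0)
print(round(float(mask.mean()), 2))
```

At train time the mask multiplies a layer's activations, and a regularizer on `p` (an entropy term plus a weight penalty) keeps the learned rate well calibrated; the library handles that bookkeeping per layer.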
Stars
14
Forks
4
Language
Python
License
MIT
Category
Last pushed
Dec 18, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/aurelio-amerio/ConcreteDropout"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
kk7nc/RMDL
RMDL: Random Multimodel Deep Learning for Classification
MaximeVandegar/Papers-in-100-Lines-of-Code
Implementation of papers in 100 lines of code.
OML-Team/open-metric-learning
Metric learning and retrieval pipelines, models and zoo.
miguelvr/dropblock
Implementation of DropBlock: A regularization method for convolutional networks in PyTorch.
DLTK/DLTK
Deep Learning Toolkit for Medical Image Analysis