awslabs/adatune

Gradient based Hyperparameter Tuning library in PyTorch

Score: 41 / 100 (Emerging)

This project helps machine learning practitioners train deep neural networks more efficiently by adapting the learning rate automatically during training, rather than relying on a hand-tuned schedule. Given your PyTorch model and training setup, it returns a model trained with a learning rate that is adjusted on the fly. It is aimed at machine learning engineers, researchers, and data scientists working with deep learning models.
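
adatune's own API is not shown on this card, so the sketch below illustrates the underlying idea in plain PyTorch instead: hypergradient descent (Baydin et al., 2018), where the learning rate is itself updated by gradient descent during training. The model, data, and step sizes are illustrative stand-ins, not the library's interface.

import torch
import torch.nn as nn
torch.manual_seed(0)
# Toy regression task and model, stand-ins for "your PyTorch model
# and training setup".
model = nn.Linear(10, 1)
loss_fn = nn.MSELoss()
X, y = torch.randn(256, 10), torch.randn(256, 1)
lr, hyper_lr = 0.01, 1e-4  # initial learning rate and hypergradient step size
prev_grads = None
for _ in range(100):
    loss = loss_fn(model(X), y)
    grads = torch.autograd.grad(loss, list(model.parameters()))
    if prev_grads is not None:
        # The hypergradient of the loss w.r.t. lr is -<g_t, g_{t-1}>;
        # stepping against it adapts lr on the fly instead of
        # following a fixed schedule.
        h = sum((g * pg).sum() for g, pg in zip(grads, prev_grads))
        lr += hyper_lr * h.item()
    with torch.no_grad():
        for p, g in zip(model.parameters(), grads):
            p -= lr * g
    prev_grads = [g.detach() for g in grads]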

291 stars. No commits in the last 6 months.

Use this if you are training deep learning models in PyTorch and want to automatically optimize the learning rate to achieve better model performance with less manual tuning.

Not ideal if you are not using PyTorch for your deep learning models or prefer to manually control all hyperparameter adjustments.

deep-learning model-training hyperparameter-optimization neural-networks machine-learning-engineering
Status: Stale (6 months) · No package published · No dependents

Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 15 / 25

Stars: 291
Forks: 32
Language: Python
License: Apache-2.0
Last pushed: Jul 17, 2020
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/awslabs/adatune"

Open to everyone: 100 requests/day with no key. A free key raises the limit to 1,000 requests/day.
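
For completeness, here is the same request from Python using only the standard library. The response is assumed to be JSON; its schema, and the header used to pass an API key, are not documented on this page, so the sketch simply pretty-prints whatever comes back.

import json
import urllib.request
# Fetch the quality data for awslabs/adatune (same URL as the curl
# line above) and pretty-print the assumed-JSON response.
url = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/awslabs/adatune"
with urllib.request.urlopen(url) as resp:
    data = json.load(resp)
print(json.dumps(data, indent=2))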