awslabs/adatune
Gradient-based hyperparameter tuning library in PyTorch
This project helps machine learning practitioners train deep neural networks more efficiently by automatically finding good learning rate schedules. Given a PyTorch model and training setup, it adapts the learning rate on the fly during training and returns a more effectively trained model. It is aimed at machine learning engineers, researchers, and data scientists working with deep learning models.
291 stars. No commits in the last 6 months.
Use this if you are training deep learning models in PyTorch and want to automatically optimize the learning rate to achieve better model performance with less manual tuning.
Not ideal if you are not using PyTorch for your deep learning models or prefer to manually control all hyperparameter adjustments.
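To make "gradient-based" tuning concrete: the general idea behind libraries like this is hypergradient descent, where the learning rate is itself updated using gradient information. The sketch below is not AdaTune's actual API; it is a minimal, self-contained illustration of the technique on a toy quadratic loss, with all names and constants chosen for the example.

```python
# Sketch of hypergradient descent, the general technique behind
# gradient-based learning-rate adaptation (NOT AdaTune's actual API).
# The learning rate alpha is nudged by the dot product of the current
# and previous gradients: if successive gradients agree, alpha grows;
# if they oppose each other, alpha shrinks.

def grad(w):
    # Gradient of a toy quadratic loss L(w) = 0.5 * (w - 3)^2
    return w - 3.0

def hypergradient_descent(w=0.0, alpha=0.01, beta=0.001, steps=200):
    prev_g = 0.0
    for _ in range(steps):
        g = grad(w)
        # Hypergradient step: the derivative of the loss with respect
        # to alpha is approximately -g_t * g_{t-1}, so this moves alpha
        # in the direction that speeds up recent progress.
        alpha += beta * g * prev_g
        w -= alpha * g
        prev_g = g
    return w, alpha

final_w, final_alpha = hypergradient_descent()
# final_w converges toward the minimizer w = 3.0, while alpha has
# grown from its small initial value without manual scheduling.
```

In a real training loop the same update is applied to the optimizer's learning rate using minibatch gradients; the appeal is that one hyper-learning-rate (`beta` here) is far less sensitive than the learning rate it replaces.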
Stars: 291
Forks: 32
Language: Python
License: Apache-2.0
Category:
Last pushed: Jul 17, 2020
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/awslabs/adatune"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
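The same call can be made from Python. The endpoint path below is taken from the curl example above; the JSON response fields and the API-key header name are assumptions and may differ from the service's actual contract.

```python
# Hypothetical Python wrapper around the curl example above.
# The URL structure comes from this page; the response schema and
# the "X-API-Key" header name are assumptions.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def api_url(category, owner, repo):
    # Mirrors the path of the curl example:
    # /api/v1/quality/<category>/<owner>/<repo>
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category, owner, repo, api_key=None):
    # An optional key raises the limit from 100 to 1,000 requests/day.
    req = urllib.request.Request(api_url(category, owner, repo))
    if api_key:
        req.add_header("X-API-Key", api_key)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Network call omitted here; just show the constructed URL.
    print(api_url("ml-frameworks", "awslabs", "adatune"))
```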
Higher-rated alternatives
optuna/optuna: A hyperparameter optimization framework
keras-team/keras-tuner: A Hyperparameter Tuning Library for Keras
KernelTuner/kernel_tuner: Kernel Tuner
syne-tune/syne-tune: Large scale and asynchronous Hyperparameter and Architecture Optimization at your fingertips.
deephyper/deephyper: DeepHyper: A Python Package for Massively Parallel Hyperparameter Optimization in Machine Learning