ildoonet/pytorch-gradual-warmup-lr
Gradually-Warmup Learning Rate Scheduler for PyTorch
This tool helps machine learning engineers and researchers manage the learning rate of their PyTorch models during the initial stages of training. It takes your existing optimizer and a defined learning rate schedule, and then gradually increases the learning rate over a specified number of epochs. This process helps stabilize model training, especially when using large batch sizes, leading to better overall performance.
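The warmup behavior described above boils down to a simple interpolation. Below is a minimal sketch of that math (not the library's actual code), assuming the rate ramps linearly from base_lr up to multiplier * base_lr over total_epoch epochs; all parameter names here are illustrative:

```python
def warmup_lr(epoch, base_lr, multiplier, total_epoch):
    """Linear warmup: base_lr at epoch 0, multiplier * base_lr at total_epoch.

    After total_epoch, a wrapped scheduler (cosine, step decay, etc.)
    would normally take over; here we simply hold the peak rate.
    """
    if epoch >= total_epoch:
        return base_lr * multiplier
    # Interpolate the multiplier from 1.0 to its full value.
    return base_lr * ((multiplier - 1.0) * epoch / total_epoch + 1.0)
```

For example, with base_lr=0.1, multiplier=10, and total_epoch=5, the rate climbs from 0.1 at epoch 0 to 1.0 at epoch 5.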
991 stars. No commits in the last 6 months.
Use this if you are training deep learning models in PyTorch and want to apply a 'warm-up' period to your learning rate schedule to improve training stability and model convergence.
Not ideal if you are not using PyTorch for model training or if your existing training setup does not require a gradual learning rate warm-up phase.
Stars: 991
Forks: 126
Language: Python
License: MIT
Category: ml-frameworks
Last pushed: Oct 10, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/ildoonet/pytorch-gradual-warmup-lr"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
nschaetti/EchoTorch
A Python toolkit for Reservoir Computing and Echo State Network experimentation based on...
metaopt/torchopt
TorchOpt is an efficient library for differentiable optimization built upon PyTorch.
opthub-org/pytorch-bsf
PyTorch implementation of Bezier simplex fitting
gpauloski/kfac-pytorch
Distributed K-FAC preconditioner for PyTorch
pytorch/xla
Enabling PyTorch on XLA Devices (e.g. Google TPU)