ildoonet/pytorch-gradual-warmup-lr

Gradually-Warmup Learning Rate Scheduler for PyTorch

Score: 47 / 100 (Emerging)

This tool helps machine learning engineers and researchers manage the learning rate of their PyTorch models during the initial stages of training. It takes your existing optimizer and a defined learning rate schedule, and then gradually increases the learning rate over a specified number of epochs. This process helps stabilize model training, especially when using large batch sizes, leading to better overall performance.
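The warm-up ramp itself is easy to sketch. Below is an illustrative, pure-Python version of a linear warm-up: the learning rate grows from `base_lr` to `base_lr * multiplier` over `total_epoch` epochs, after which a wrapped scheduler (e.g. StepLR) would take over. The function name and exact formula are assumptions modeled on the repository's README, not the library's actual code:

```python
def warmup_lr(base_lr, multiplier, total_epoch, epoch):
    """Illustrative linear warm-up (hypothetical helper, not this library's API).

    Ramps from base_lr at epoch 0 up to base_lr * multiplier at total_epoch.
    """
    if epoch >= total_epoch:
        # Warm-up finished; the after-scheduler would normally run from here.
        return base_lr * multiplier
    return base_lr * ((multiplier - 1.0) * epoch / total_epoch + 1.0)

# Ramp 0.1 -> 1.0 over 5 epochs
for e in range(6):
    print(round(warmup_lr(0.1, 10, 5, e), 3))
```

In the actual library this logic is wrapped in a scheduler class that you step each epoch alongside your optimizer, chaining into a standard PyTorch scheduler once the warm-up completes.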

991 stars. No commits in the last 6 months.

Use this if you are training deep learning models in PyTorch and want to apply a 'warm-up' period to your learning rate schedule to improve training stability and model convergence.

Not ideal if you are not using PyTorch for model training or if your existing training setup does not require a gradual learning rate warm-up phase.

Tags: deep-learning-training, pytorch-development, model-optimization, neural-network-training
Badges: Stale (6m), No Package, No Dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 21 / 25


Stars: 991
Forks: 126
Language: Python
License: MIT
Last pushed: Oct 10, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/ildoonet/pytorch-gradual-warmup-lr"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.