kozistr/pytorch_optimizer
Optimizer, LR scheduler, and loss function collections in PyTorch
This toolkit for machine learning engineers and researchers building deep learning models with PyTorch provides a comprehensive collection of advanced optimizers, learning-rate schedulers, and loss functions. The components drop into existing training loops, making it easy to experiment quickly and improve model performance.
Use this if you are a PyTorch developer looking to quickly experiment with a wide range of modern optimization techniques and loss functions for training your deep learning models.
Not ideal if you are not using PyTorch or prefer to implement optimization algorithms from scratch.
Stars
393
Forks
37
Language
Python
License
Apache-2.0
Category
ML Frameworks
Last pushed
Mar 01, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/kozistr/pytorch_optimizer"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.
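The same endpoint can be called from Python. A small sketch using only the standard library (the response fields are not documented here, so the payload is simply decoded as JSON):

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (counts against the daily quota)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Network call; uncomment to hit the live API.
    # print(json.dumps(fetch_quality("ml-frameworks", "kozistr", "pytorch_optimizer"), indent=2))
    print(quality_url("ml-frameworks", "kozistr", "pytorch_optimizer"))
```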
Related frameworks
nschaetti/EchoTorch
A Python toolkit for Reservoir Computing and Echo State Network experimentation based on...
metaopt/torchopt
TorchOpt is an efficient library for differentiable optimization built upon PyTorch.
opthub-org/pytorch-bsf
PyTorch implementation of Bezier simplex fitting
gpauloski/kfac-pytorch
Distributed K-FAC preconditioner for PyTorch
pytorch/xla
Enabling PyTorch on XLA Devices (e.g. Google TPU)