nanowell/AdEMAMix-Optimizer-Pytorch
The AdEMAMix Optimizer: Better, Faster, Older.
This project provides a PyTorch implementation of AdEMAMix, a variant of Adam that combines a fast and a slow exponential moving average of the gradients. Used as a drop-in replacement for your existing optimizer, it can reach a given loss in fewer training steps. It's for anyone building or experimenting with deep learning models who wants to improve their training routine.
186 stars. No commits in the last 6 months.
Use this if you are training deep learning models and want to achieve better performance, faster convergence, or more stable training outcomes.
Not ideal if you are looking for a pre-trained model or a tool for data preprocessing rather than training optimization.
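To make the idea concrete, here is a minimal sketch of the AdEMAMix update rule on a single scalar parameter, following the update described in the AdEMAMix paper. This is an illustration only: the repo's actual class name, method signatures, and default hyperparameters may differ, and the values for `betas` and `alpha` below are assumptions taken from the paper's defaults.

```python
import math

def ademamix_step(theta, g, state, lr=1e-3, betas=(0.9, 0.999, 0.9999),
                  alpha=5.0, eps=1e-8):
    """One AdEMAMix update on a scalar parameter (illustrative sketch).

    Keeps a fast EMA (beta1) and a slow EMA (beta3) of the gradient,
    plus an EMA of the squared gradient (beta2), then mixes the two
    momentum terms before the Adam-style normalization.
    """
    b1, b2, b3 = betas
    state["t"] += 1
    t = state["t"]
    # Fast EMA of the gradient, bias-corrected as in Adam.
    state["m1"] = b1 * state["m1"] + (1 - b1) * g
    # Slow EMA of the gradient (used without bias correction).
    state["m2"] = b3 * state["m2"] + (1 - b3) * g
    # EMA of the squared gradient, bias-corrected as in Adam.
    state["nu"] = b2 * state["nu"] + (1 - b2) * g * g
    m1_hat = state["m1"] / (1 - b1 ** t)
    nu_hat = state["nu"] / (1 - b2 ** t)
    # Mix fast and slow momentum, then normalize as Adam would.
    return theta - lr * (m1_hat + alpha * state["m2"]) / (math.sqrt(nu_hat) + eps)

# Usage: minimize f(x) = x^2, so the gradient at x is 2*x.
state = {"t": 0, "m1": 0.0, "m2": 0.0, "nu": 0.0}
x = 1.0
for _ in range(200):
    x = ademamix_step(x, 2 * x, state, lr=0.05)
```

The slow EMA retains information from much older gradients, which is the "Older" part of the paper's title; the fast EMA keeps the optimizer responsive to recent gradients.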
Stars: 186
Forks: 10
Language: Python
License: MIT
Category: ml-frameworks
Last pushed: Sep 12, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/nanowell/AdEMAMix-Optimizer-Pytorch"
Open to everyone: 100 requests/day with no key. A free key raises the limit to 1,000/day.
Higher-rated alternatives
nschaetti/EchoTorch
A Python toolkit for Reservoir Computing and Echo State Network experimentation based on...
metaopt/torchopt
TorchOpt is an efficient library for differentiable optimization built upon PyTorch.
gpauloski/kfac-pytorch
Distributed K-FAC preconditioner for PyTorch
opthub-org/pytorch-bsf
PyTorch implementation of Bezier simplex fitting
pytorch/xla
Enabling PyTorch on XLA Devices (e.g. Google TPU)