kach/gradient-descent-the-ultimate-optimizer
Code for our NeurIPS 2022 paper
This tool helps machine learning engineers and researchers by automating the often-manual tuning of optimizer hyperparameters such as the learning rate and momentum. It does so by applying gradient descent to the hyperparameters themselves: you provide your existing neural network model and training data, and the system learns the hyperparameter settings alongside the model weights. The result is a more robust, better-performing trained model, with far less tedious manual experimentation.
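The core idea, taking a gradient step on the learning rate itself, can be sketched in plain Python on a 1-D quadratic loss. This is an illustrative, hand-derived hypergradient update, not the package's actual API: the library computes such derivatives automatically via automatic differentiation, and the update rule below (alpha adjusted by the product of consecutive gradients) is just the simplest instance of that idea.

```python
def grad(w):
    # Gradient of the toy loss f(w) = w**2
    return 2.0 * w

w = 5.0        # model parameter
alpha = 0.01   # learning rate (the hyperparameter we tune)
beta = 0.001   # hyper-learning-rate for updating alpha (assumed value)

g_prev = grad(w)
w -= alpha * g_prev

for _ in range(100):
    g = grad(w)
    # Hand-derived hypergradient: since w_t = w_{t-1} - alpha * g_{t-1},
    # dL(w_t)/d(alpha) = -g_t * g_{t-1}. Descend on alpha with that gradient:
    alpha -= beta * (-g * g_prev)
    # Ordinary gradient step on the model parameter with the updated alpha.
    w -= alpha * g
    g_prev = g
```

When consecutive gradients point the same way, alpha grows; when they disagree (overshooting), alpha shrinks. The library generalizes this to whole optimizer stacks (e.g. tuning Adam's hyperparameters, or even the hyper-optimizer's own hyperparameters).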
371 stars. No commits in the last 6 months. Available on PyPI.
Use this if you are a machine learning practitioner who wants to improve model performance and cut the time spent manually tuning optimizer hyperparameters for deep learning models.
Not ideal if you are working with non-gradient-based optimization algorithms or do not have a strong understanding of deep learning training processes.
Stars: 371
Forks: 21
Language: Python
License: MIT
Category:
Last pushed: Jan 13, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/kach/gradient-descent-the-ultimate-optimizer"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
nschaetti/EchoTorch
A Python toolkit for Reservoir Computing and Echo State Network experimentation based on...
metaopt/torchopt
TorchOpt is an efficient library for differentiable optimization built upon PyTorch.
gpauloski/kfac-pytorch
Distributed K-FAC preconditioner for PyTorch
opthub-org/pytorch-bsf
PyTorch implementation of Bezier simplex fitting
pytorch/xla
Enabling PyTorch on XLA Devices (e.g. Google TPU)