kach/gradient-descent-the-ultimate-optimizer

Code for our NeurIPS 2022 paper, "Gradient Descent: The Ultimate Optimizer"

Quality score: 47 / 100 (Emerging)

This tool helps machine learning engineers and researchers by automating the often-manual process of tuning optimizer hyperparameters such as learning rates and momentum. Rather than searching over hyperparameter settings, it applies gradient descent to the hyperparameters themselves, differentiating through the optimizer's own update step. You provide your existing neural network model and training data, and the system learns the hyperparameters alongside the weights. The result is a more robust, better-performing trained model with far less manual experimentation.
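The core idea can be illustrated without the library itself. The sketch below, a toy 1-D example rather than the project's actual API, runs ordinary SGD on a weight while also taking a gradient step on the learning rate, using the chain rule through the SGD update (the function names and the meta-learning-rate `kappa` are choices made for this illustration):

```python
def f(w):
    # Toy loss: f(w) = w^2, minimized at w = 0.
    return w * w

def df(w):
    # Its derivative: f'(w) = 2w.
    return 2.0 * w

def hyper_sgd(w, alpha, kappa, steps):
    """SGD on w with step size alpha, plus gradient descent on alpha
    itself with meta-step-size kappa (a hypergradient-descent sketch,
    not the library's interface)."""
    for _ in range(steps):
        g = df(w)
        w_next = w - alpha * g              # ordinary SGD step
        # Differentiate the post-step loss w.r.t. alpha:
        # d f(w_next)/d alpha = f'(w_next) * d(w - alpha*g)/d alpha
        #                     = f'(w_next) * (-g)
        hypergrad = df(w_next) * (-g)
        alpha = alpha - kappa * hypergrad   # descend on the step size too
        w = w_next
    return w, alpha

w, alpha = hyper_sgd(w=1.0, alpha=0.01, kappa=0.001, steps=100)
```

Starting from a deliberately small learning rate, the hypergradient pushes `alpha` upward while the loss falls, which is exactly the behavior that removes the need to hand-pick the initial value. The library generalizes this to full optimizers (SGD with momentum, Adam) by backpropagating through the update rule.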

371 stars. No commits in the last 6 months. Available on PyPI.

Use this if you are a machine learning practitioner looking to improve model performance and reduce the time spent manually tuning optimizer hyperparameters for your deep learning models.

Not ideal if you are working with non-gradient-based optimization algorithms or do not have a strong understanding of deep learning training processes.

deep-learning-optimization hyperparameter-tuning neural-network-training machine-learning-research model-performance-tuning
Stale (6 months) · No dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 25 / 25
Community 12 / 25


Stars: 371
Forks: 21
Language: Python
License: MIT
Last pushed: Jan 13, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/kach/gradient-descent-the-ultimate-optimizer"

Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.