wyzjack/AdaM3

[ICDM 2023] Momentum is All You Need for Data-Driven Adaptive Optimization

Score: 19 / 100 (Experimental)

AdaM3 helps machine learning researchers and practitioners train deep neural networks. By changing how adaptive gradient methods like Adam adjust per-parameter learning rates, it aims to make training both faster and more stable. The input is a deep learning model and training data; the output is a trained model that generalizes better to unseen data, avoiding common failure modes such as getting stuck in poor local minima.
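For orientation, here is a minimal sketch of the Adam-style update that adaptive methods in this family build on. This is the standard Adam rule, not AdaM3's modified rule (the repo's paper describes that); the scalar setup and hyperparameter values are illustrative assumptions.

```python
import math

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One standard Adam update on a scalar parameter (illustrative only)."""
    m = beta1 * m + (1 - beta1) * grad       # first-moment (momentum) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2  # second-moment estimate
    m_hat = m / (1 - beta1 ** t)             # bias correction
    v_hat = v / (1 - beta2 ** t)
    theta -= lr * m_hat / (math.sqrt(v_hat) + eps)  # adaptive step
    return theta, m, v

# Minimize f(x) = x^2 (gradient 2x) from x = 5.0
x, m, v = 5.0, 0.0, 0.0
for t in range(1, 2001):
    x, m, v = adam_step(x, 2 * x, m, v, t, lr=0.05)
```

Methods like AdaM3 keep this overall structure but change how the second-moment term `v` shapes the effective learning rate.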

No commits in the last 6 months.

Use this if you are developing or training deep learning models and need an optimizer that offers both rapid training and superior generalization performance.

Not ideal if you are not working with deep learning models or do not have issues with current adaptive optimizers like Adam.

deep-learning-optimization neural-network-training model-generalization machine-learning-research
No License · Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 7 / 25
Maturity 8 / 25
Community 4 / 25


Stars: 26
Forks: 1
Language: Python
License: none
Last pushed: Mar 30, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/wyzjack/AdaM3"
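The same endpoint can be called from Python. This sketch only builds the request URL from the pattern in the curl example above; the shape of the JSON response is not documented here, so fetching and parsing it is left to the caller.

```python
# Base path taken from the curl example; the response schema is an assumption.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the per-repo quality endpoint URL."""
    return f"{BASE}/{category}/{owner}/{repo}"

url = quality_url("ml-frameworks", "wyzjack", "AdaM3")
```

From here, `urllib.request.urlopen(url)` (or any HTTP client) retrieves the JSON payload, subject to the rate limits below.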

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.