adrienkegreisz/ano-experiments

Source code for the Ano paper: a robust optimizer for deep learning in noisy settings.

21 / 100 (Experimental)

This project offers Ano, a novel optimization method for training deep learning models more effectively, especially in environments with unpredictable or 'noisy' data. Given a deep learning model and training data, it adjusts how the model's internal parameters are updated and produces a more robustly trained model. Data scientists, machine learning engineers, and researchers working on deep learning applications will find this useful.

No commits in the last 6 months.

Use this if you are training deep learning models in challenging conditions, such as noisy data or non-stationary environments, and want to improve training speed and performance over standard optimizers like Adam or Adan.

Not ideal if you are not working with deep learning models, or if your current training setup already performs well and gradient noise is not a problem.
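Ano's exact update rule is defined in the paper; as an illustration of the setting it targets, here is a minimal pure-Python sketch comparing plain SGD with a momentum-plus-sign update (a common robustness heuristic, used here for illustration only and not necessarily Ano's published rule) on a one-dimensional quadratic with heavy gradient noise:

```python
import random

def noisy_grad(x, rng, noise=2.0):
    # Gradient of f(x) = 0.5 * x^2 plus Gaussian noise,
    # mimicking a noisy training landscape.
    return x + rng.gauss(0.0, noise)

def sgd(x, steps=500, lr=0.1, seed=0):
    # Plain SGD: steps scale directly with the noisy gradient.
    rng = random.Random(seed)
    for _ in range(steps):
        x -= lr * noisy_grad(x, rng)
    return x

def momentum_sign(x, steps=500, lr=0.1, beta=0.9, seed=0):
    # Momentum smooths the noise; taking the sign of the momentum
    # bounds every step, a heuristic often used for noisy gradients.
    # (Illustrative only -- not Ano's actual update rule.)
    rng = random.Random(seed)
    m = 0.0
    for _ in range(steps):
        m = beta * m + (1 - beta) * noisy_grad(x, rng)
        x -= lr * (1.0 if m > 0 else -1.0 if m < 0 else 0.0)
    return x

if __name__ == "__main__":
    # Both start far from the optimum at x = 0; compare final distances.
    print(abs(sgd(10.0)), abs(momentum_sign(10.0)))
```

With noise this large, the bounded sign-of-momentum step typically settles closer to the optimum than raw SGD, which is the kind of behavior robust optimizers aim for.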

deep-learning machine-learning-engineering model-training computer-vision natural-language-processing
Stale (6 months) · No Package · No Dependents
Maintenance: 2 / 25
Adoption: 4 / 25
Maturity: 15 / 25
Community: 0 / 25


Stars: 7
Forks:
Language: Python
License: MIT
Last pushed: Jul 31, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/adrienkegreisz/ano-experiments"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
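The same endpoint can be queried from Python. A minimal sketch using only the standard library, with the endpoint path taken from the curl command above (the response schema is not documented here, so the result is just printed):

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(collection: str, owner: str, repo: str) -> str:
    # Build the endpoint URL shown in the curl example above.
    return f"{BASE}/{collection}/{owner}/{repo}"

def fetch_quality(collection: str, owner: str, repo: str) -> dict:
    # No API key needed for up to 100 requests/day.
    with urllib.request.urlopen(quality_url(collection, owner, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    data = fetch_quality("ml-frameworks", "adrienkegreisz", "ano-experiments")
    print(json.dumps(data, indent=2))
```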