adrienkegreisz/ano-experiments
Source code for the Ano paper: a robust optimizer for deep learning in noisy settings.
This project provides Ano, a novel optimization method for training deep learning models more effectively, especially on unpredictable or noisy data. Given a model and training data, it adjusts how the model's parameters are updated to produce a more robustly trained result. It is aimed at data scientists, machine learning engineers, and researchers working on deep learning applications.
No commits in the last 6 months.
Use this if you are training deep learning models in challenging conditions, such as with noisy data or in non-stationary environments, and want to improve training speed and performance over standard optimizers like Adam or Adan.
Not ideal if you are not working with deep learning models, or if your current training setup already performs well and is unaffected by gradient noise.
Stars: 7
Forks: —
Language: Python
License: MIT
Category: ml-frameworks
Last pushed: Jul 31, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/adrienkegreisz/ano-experiments"
Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000/day.
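The same endpoint can be queried from Python. A minimal sketch, assuming the endpoint returns JSON and that an API key, if used, is sent as a bearer token (the auth header and response fields are assumptions, not documented here):

```python
import json
import urllib.request

BASE_URL = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the API URL for a repository's quality record."""
    return f"{BASE_URL}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str, api_key: str = "") -> dict:
    """Fetch the quality record as parsed JSON.

    An API key (assumed here to be sent as a bearer token) raises
    the daily rate limit from 100 to 1,000 requests.
    """
    req = urllib.request.Request(quality_url(category, owner, repo))
    if api_key:
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)

# Example: the URL for this repository's record.
url = quality_url("ml-frameworks", "adrienkegreisz", "ano-experiments")
```

Building the URL separately from the request makes the rate-limited network call optional and keeps the endpoint path easy to verify against the curl example above.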
Higher-rated alternatives
nschaetti/EchoTorch
A Python toolkit for Reservoir Computing and Echo State Network experimentation based on...
metaopt/torchopt
TorchOpt is an efficient library for differentiable optimization built upon PyTorch.
gpauloski/kfac-pytorch
Distributed K-FAC preconditioner for PyTorch
opthub-org/pytorch-bsf
PyTorch implementation of Bezier simplex fitting
pytorch/xla
Enabling PyTorch on XLA Devices (e.g. Google TPU)