nanowell/AdEMAMix-Optimizer-Pytorch

The AdEMAMix Optimizer: Better, Faster, Older.

Score: 36 / 100 (Emerging)

This project provides a PyTorch implementation of AdEMAMix, an optimizer that modifies Adam by mixing two exponential moving averages (EMAs) of past gradients: a fast EMA, as in Adam, and a slow EMA that lets much older gradients still contribute to each update. It is intended as a drop-in replacement for optimizers such as Adam or AdamW, for anyone training deep learning models who wants faster convergence or better final performance.

186 stars. No commits in the last 6 months.

Use this if you are training deep learning models and want to achieve better performance, faster convergence, or more stable training outcomes.

Not ideal if you are looking for a pre-trained model or a tool for data preprocessing rather than training optimization.
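To make the idea concrete, here is a minimal scalar sketch of the AdEMAMix update rule as described in the paper the repository implements. The function name, parameter names, and defaults below are illustrative assumptions, not this repository's actual API; the real implementation operates on PyTorch tensors and parameter groups.

```python
import math

def ademamix_step(theta, g, state, lr=1e-3, beta1=0.9, beta2=0.999,
                  beta3=0.9999, alpha=5.0, eps=1e-8):
    """One AdEMAMix update for a single scalar parameter (illustrative only).

    A fast EMA m1 (as in Adam) is mixed with a slow EMA m2 of past
    gradients, weighted by alpha. With beta3 close to 1, m2 keeps very
    old gradients relevant, which is the core idea behind AdEMAMix.
    """
    state["t"] += 1
    t = state["t"]
    # Fast EMA of the gradient (same as Adam's first moment).
    state["m1"] = beta1 * state["m1"] + (1 - beta1) * g
    # Slow EMA: older gradients decay much more slowly here.
    state["m2"] = beta3 * state["m2"] + (1 - beta3) * g
    # Second moment, as in Adam.
    state["nu"] = beta2 * state["nu"] + (1 - beta2) * g * g
    # Bias correction applies to m1 and nu; m2 is left uncorrected.
    m1_hat = state["m1"] / (1 - beta1 ** t)
    nu_hat = state["nu"] / (1 - beta2 ** t)
    # Combined update: fast momentum plus alpha-weighted slow momentum.
    return theta - lr * (m1_hat + alpha * state["m2"]) / (math.sqrt(nu_hat) + eps)

# Usage: minimize f(theta) = theta**2, whose gradient is 2 * theta.
state = {"t": 0, "m1": 0.0, "m2": 0.0, "nu": 0.0}
theta = 1.0
for _ in range(3):
    theta = ademamix_step(theta, g=2 * theta, state=state)
```

In practice you would instantiate the repository's optimizer class the same way you would `torch.optim.AdamW`, with `beta3` and `alpha` as the extra hyperparameters controlling the slow EMA.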

deep-learning-training neural-network-optimization machine-learning-research model-performance algorithm-development
Stale (6m) | No Package | No Dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 10 / 25


Stars: 186
Forks: 10
Language: Python
License: MIT
Last pushed: Sep 12, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/nanowell/AdEMAMix-Optimizer-Pytorch"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.