fadel/pytorch_ema
Tiny PyTorch library for maintaining a moving average of a collection of parameters.
This library helps machine learning engineers improve the stability and performance of their deep learning models during and after training. It maintains an exponential moving average (EMA) of the model's weights: after each optimizer step, a shadow copy of the parameters is nudged toward the current weights, and this smoothed copy can be swapped in for evaluation. Models evaluated with EMA weights often generalize better to new, unseen data, which is crucial for deployment.
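The update rule behind this technique is simple enough to show in plain Python. A minimal sketch of the per-step EMA update (the function name and values here are illustrative, not the library's API):

```python
# Sketch of the EMA update rule applied after each optimizer step:
#   shadow_i <- decay * shadow_i + (1 - decay) * param_i
# Plain Python, no PyTorch dependency; ema_update is a hypothetical name.

def ema_update(shadow, params, decay=0.995):
    """Move each shadow value a small step toward the current parameter."""
    return [decay * s + (1.0 - decay) * p for s, p in zip(shadow, params)]

# Example: the shadow weights lag behind the raw parameters,
# smoothing out noisy per-step updates.
shadow = [0.0, 0.0]
params = [1.0, -1.0]
for _ in range(3):
    shadow = ema_update(shadow, params, decay=0.9)
# shadow is now approximately [0.271, -0.271]
```

With a decay close to 1 (e.g. 0.995 or 0.999), the shadow weights average over many recent steps, which is what smooths out training noise.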
444 stars. No commits in the last 6 months.
Use this if you are a machine learning engineer or researcher who wants to improve the generalization and robustness of PyTorch models by applying exponential moving averages to their parameters.
Not ideal if you work with non-PyTorch deep learning frameworks or do not need parameter averaging.
Stars
444
Forks
27
Language
Python
License
MIT
Category
Last pushed
Oct 02, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/fadel/pytorch_ema"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
pytorch/pytorch
Tensors and Dynamic neural networks in Python with strong GPU acceleration
keras-team/keras
Deep Learning for humans
Lightning-AI/torchmetrics
Machine learning metrics for distributed, scalable PyTorch applications.
Lightning-AI/pytorch-lightning
Pretrain, finetune ANY AI model of ANY size on 1 or 10,000+ GPUs with zero code changes.
lanpa/tensorboardX
tensorboard for pytorch (and chainer, mxnet, numpy, ...)