NoteDance/optimizers
This project implements optimizers for TensorFlow and Keras that can be used in the same way as the built-in Keras optimizers.
Topics: Machine learning, Deep learning
This project offers optimization algorithms beyond the standard Keras set, which can improve how quickly and reliably a model learns from data: given an existing deep learning model and training data, these optimizers can yield faster convergence and better final performance. Machine learning engineers and researchers who train neural networks would find this useful.
Use this if you are training deep learning models in TensorFlow or Keras and want to improve training stability, speed, or final model performance, especially when dealing with noisy data or large batch sizes.
Not ideal if you are working with traditional machine learning algorithms outside of deep learning frameworks or if you don't need fine-grained control over neural network training dynamics.
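To illustrate what an optimizer library like this implements under the hood, here is a minimal sketch of one gradient-descent-with-momentum update step in plain Python. This is a generic example of the technique, not code from the repository; the function name, hyperparameter values, and toy problem are all invented for illustration.

```python
# Minimal sketch of a momentum-SGD update rule, the kind of algorithm an
# optimizer library implements. All names here are illustrative only.

def momentum_sgd_step(param, velocity, grad, lr=0.1, momentum=0.9):
    """One optimizer step: blend the old velocity with the new gradient,
    then move the parameter along the updated velocity."""
    velocity = momentum * velocity - lr * grad
    return param + velocity, velocity

# Toy problem: minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
x, v = 0.0, 0.0
for _ in range(100):
    grad = 2.0 * (x - 3.0)
    x, v = momentum_sgd_step(x, v, grad)

print(x)  # x ends up close to the minimum at 3.0
```

A real Keras optimizer wraps the same idea behind the `apply_gradients` / `update_step` interface, so a third-party optimizer can be passed to `model.compile(optimizer=...)` in place of a built-in one.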
Stars
49
Forks
7
Language
Python
License
Apache-2.0
Category
Last pushed
Mar 11, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/NoteDance/optimizers"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
nschaetti/EchoTorch
A Python toolkit for Reservoir Computing and Echo State Network experimentation based on...
metaopt/torchopt
TorchOpt is an efficient library for differentiable optimization built upon PyTorch.
gpauloski/kfac-pytorch
Distributed K-FAC preconditioner for PyTorch
opthub-org/pytorch-bsf
PyTorch implementation of Bezier simplex fitting
pytorch/xla
Enabling PyTorch on XLA Devices (e.g. Google TPU)