motasemwed/optimization-algorithms-comparison
A practical comparison of classical optimization algorithms (GD, SGD, Momentum, Adam, RMSProp, Adagrad, Newton) analyzing convergence speed, stability, and loss minimization for machine learning.
Stars: —
Forks: —
Language: Jupyter Notebook
License: —
Category: —
Last pushed: Jan 29, 2026
Commits (30d): 0
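As a rough sketch of what the repository compares, the snippet below applies two of the listed update rules (plain gradient descent and Adam, in their textbook form) to a toy quadratic loss. The function names, hyperparameters, and toy loss are illustrative assumptions and are not taken from the repository's notebooks.

# Illustrative sketch only: textbook forms of two of the compared optimizers
# (plain gradient descent and Adam) on a toy quadratic loss f(w) = ||w||^2.
# Names and hyperparameters are assumptions, not the repository's code.
import numpy as np

def gd_step(w, grad, lr=0.1):
    # Vanilla gradient descent: step opposite the gradient.
    return w - lr * grad

def adam_step(w, grad, state, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam: bias-corrected running estimates of the first and second moments.
    m, v, t = state
    t += 1
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), (m, v, t)

w_gd = np.array([3.0, -2.0])
w_adam = w_gd.copy()
state = (np.zeros(2), np.zeros(2), 0)
for _ in range(200):
    w_gd = gd_step(w_gd, 2 * w_gd)                    # gradient of ||w||^2 is 2w
    w_adam, state = adam_step(w_adam, 2 * w_adam, state)
print("GD:", w_gd, "Adam:", w_adam)                    # both should approach the minimum at 0

The other rules the repository lists (Momentum, RMSProp, Adagrad, Newton's method) follow the same pattern of a per-step state update and differ mainly in how they scale or precondition the gradient.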
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/motasemwed/optimization-algorithms-comparison"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
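For scripted access, a minimal Python equivalent of the curl call is sketched below using the requests library. Only the endpoint shown above is assumed; the JSON response is printed as-is, since its schema is not documented here.

# Minimal sketch of the same request from Python; assumes only the endpoint above.
import requests

URL = ("https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/"
       "motasemwed/optimization-algorithms-comparison")

resp = requests.get(URL, timeout=10)
resp.raise_for_status()   # surfaces rate-limit or server errors
print(resp.json())        # raw JSON payload; field names are not assumed here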
Higher-rated alternatives
nschaetti/EchoTorch
A Python toolkit for Reservoir Computing and Echo State Network experimentation based on...
metaopt/torchopt
TorchOpt is an efficient library for differentiable optimization built upon PyTorch.
gpauloski/kfac-pytorch
Distributed K-FAC preconditioner for PyTorch
opthub-org/pytorch-bsf
PyTorch implementation of Bezier simplex fitting
pytorch/xla
Enabling PyTorch on XLA Devices (e.g. Google TPU)