Brokttv/optimizers-from-scratch
Training models with different optimizers using NumPy only, featuring SGD, Adam, Adagrad, NAG, RMSProp, and Momentum. The repo also includes a benchmark against PyTorch's built-in optimizers.
This project offers efficient, fundamental implementations of deep learning optimizers such as Adam, SGD, and RMSProp using only NumPy. It lets machine learning practitioners train models while seeing exactly how each optimization algorithm works under the hood: you supply a model architecture and training data, and it produces the optimized model parameters.
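The repository's actual API is not shown on this page, so as a flavor of what a from-scratch NumPy optimizer looks like, here is a minimal, hypothetical Adam update step (the function name, signature, and training loop below are illustrative assumptions, not the repo's code):

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; returns the new parameter and updated moment estimates."""
    m = beta1 * m + (1 - beta1) * grad       # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad**2    # second-moment (uncentered variance)
    m_hat = m / (1 - beta1**t)               # bias correction for early steps
    v_hat = v / (1 - beta2**t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

# Toy usage: fit y = 2x with a single weight and squared-error loss.
w = np.array([0.0])
m, v = np.zeros_like(w), np.zeros_like(w)
x, y = np.array([1.0, 2.0, 3.0]), np.array([2.0, 4.0, 6.0])
for t in range(1, 2001):
    grad = 2 * np.mean((w * x - y) * x)      # d/dw of mean squared error
    w, m, v = adam_step(w, grad, m, v, t, lr=0.01)
```

SGD, Momentum, and RMSProp follow the same pattern with simpler state: plain SGD keeps no state at all, Momentum keeps only `m`, and RMSProp keeps only `v` without bias correction.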
No commits in the last 6 months.
Use this if you are a machine learning researcher or student who needs to understand, customize, or benchmark core optimization algorithms for small-to-medium scale regression or classification tasks without relying on high-level frameworks.
Not ideal if you are working on large-scale deep learning projects that need GPU acceleration, advanced model architectures, or the broader feature set of frameworks like PyTorch or TensorFlow.
Stars
13
Forks
1
Language
Jupyter Notebook
License
MIT
Category
Last pushed
Sep 09, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Brokttv/optimizers-from-scratch"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
nschaetti/EchoTorch
A Python toolkit for Reservoir Computing and Echo State Network experimentation based on...
metaopt/torchopt
TorchOpt is an efficient library for differentiable optimization built upon PyTorch.
gpauloski/kfac-pytorch
Distributed K-FAC preconditioner for PyTorch
opthub-org/pytorch-bsf
PyTorch implementation of Bezier simplex fitting
pytorch/xla
Enabling PyTorch on XLA Devices (e.g. Google TPU)