thetechdude124/Adam-Optimization-From-Scratch
📈 Implementing the ADAM optimizer from the ground up with PyTorch and comparing its performance on six 3-D objective functions (each progressively more difficult to optimize) against SGD, AdaGrad, and RMSProp.
This project helps machine learning practitioners efficiently optimize complex models, particularly those with many parameters or high-dimensional data. It takes a mathematical description of a model's loss function and its parameters, and outputs optimized parameter values that lead to better model performance. Data scientists, machine learning engineers, and researchers can use this to accelerate the training of deep learning models.
No commits in the last 6 months.
Use this if you are developing or training machine learning models and need a robust, computationally efficient method to find optimal model parameters, especially for large-scale or complex problems.
Not ideal if your primary goal is to understand basic gradient descent for very simple, low-dimensional problems, or if you require an optimizer that is invariant to hyperparameter choices.
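The optimizer the repo implements follows the standard Adam update rule: exponential moving averages of the gradient and its square, bias-corrected, then a per-parameter scaled step. Below is a minimal sketch of that rule in plain Python on a toy 1-D quadratic; the repo itself uses PyTorch and 3-D objectives, so the function and variable names here are illustrative, not taken from the repository.

```python
import math

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a scalar parameter.

    m, v are the running first- and second-moment estimates; t is the
    1-indexed step count used for bias correction.
    """
    m = beta1 * m + (1 - beta1) * grad           # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2      # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)                 # bias-corrected moments
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v

# Toy example: minimize f(x) = x^2 (gradient 2x) starting from x = 5.0
x, m, v = 5.0, 0.0, 0.0
for t in range(1, 5001):
    x, m, v = adam_step(x, 2 * x, m, v, t, lr=0.01)
```

Because the step size is normalized by the second-moment estimate, early updates are roughly `lr` in magnitude regardless of the raw gradient scale, which is the property that makes Adam robust across the progressively harder objective functions the repo benchmarks.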
Stars: 22
Forks: 4
Language: Jupyter Notebook
License: —
Category: —
Last pushed: Jul 02, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/thetechdude124/Adam-Optimization-From-Scratch"
Open to everyone: 100 requests/day with no key needed. Get a free API key for 1,000 requests/day.
Higher-rated alternatives
nschaetti/EchoTorch
A Python toolkit for Reservoir Computing and Echo State Network experimentation based on...
metaopt/torchopt
TorchOpt is an efficient library for differentiable optimization built upon PyTorch.
gpauloski/kfac-pytorch
Distributed K-FAC preconditioner for PyTorch
opthub-org/pytorch-bsf
PyTorch implementation of Bezier simplex fitting
pytorch/xla
Enabling PyTorch on XLA Devices (e.g. Google TPU)