SENATOROVAI/gradient-descent-sgd-solver-course
Stochastic Gradient Descent (SGD) is an optimization algorithm that updates model parameters iteratively using small, random subsets (mini-batches) of data rather than the entire dataset. This significantly speeds up training on large datasets, though the sampling introduces noise that can, in some cases, cause heavy fluctuations in the loss.
Topics: deep learning, neural networks, solver
This project helps you understand and implement the core optimization techniques, Gradient Descent and Stochastic Gradient Descent, that power machine learning and deep learning models. Given raw data and a loss function, it guides you through how these algorithms iteratively adjust model parameters to find the best fit. Aspiring machine learning engineers, data scientists, and AI researchers who want to deeply grasp the 'how' behind training models will find this useful.
Use this if you need to build a fundamental understanding of how machine learning models are optimized and want to implement these algorithms from scratch.
Not ideal if you are looking for a high-level library to quickly train pre-built machine learning models without delving into the underlying optimization mathematics.
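The mini-batch update loop described above can be sketched in a few lines. This is a minimal, hypothetical example (linear regression with mean squared error), not code from the course repository:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 3*x + 2 + noise
X = rng.uniform(-1, 1, size=(200, 1))
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 0.1, size=200)

w, b = 0.0, 0.0      # model parameters
lr = 0.1             # learning rate
batch_size = 16

for epoch in range(100):
    idx = rng.permutation(len(X))              # reshuffle each epoch
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]  # random mini-batch
        xb, yb = X[batch, 0], y[batch]
        err = (w * xb + b) - yb
        # Gradients of MSE with respect to w and b on this batch only
        grad_w = 2.0 * np.mean(err * xb)
        grad_b = 2.0 * np.mean(err)
        w -= lr * grad_w
        b -= lr * grad_b

print(w, b)  # should approach the true values 3.0 and 2.0
```

Setting `batch_size = len(X)` recovers full-batch Gradient Descent; smaller batches give cheaper but noisier updates, which is the trade-off the course explores.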
Stars
17
Forks
14
Language
Jupyter Notebook
License
MIT
Category
Last pushed
Mar 05, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/SENATOROVAI/gradient-descent-sgd-solver-course"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
jolars/sortedl1
Python package for Sorted L-One Penalized Estimation (SLOPE)
gugarosa/opytimizer
🐦 Opytimizer is a Python library consisting of meta-heuristic optimization algorithms.
hiroyuki-kasai/SGDLibrary
MATLAB/Octave library for stochastic optimization algorithms: Version 1.0.20
softmin/ReHLine-python
Regularized Composite ReLU-ReHU Loss Minimization with Linear Computation and Linear Convergence
SENATOROVAI/stochastic-average-gradient-sag-saga-solver-course
The SAG (Stochastic Average Gradient) + SAGA (Accelerated) solver is an optimization algorithm...