sibirbil/SMB
Stochastic gradient descent with model building
SMB is an optimization algorithm for training deep learning models more efficiently. Given an existing model and its training data, it produces trained parameters that converge faster and generalize better than plain SGD. It is aimed at deep learning practitioners and researchers who build and train neural networks.
No commits in the last 6 months.
Use this if you are training deep learning models and want to achieve faster convergence and better generalization with less hyperparameter tuning than traditional SGD or Adam.
Not a fit if you are not training deep learning models, or if you prefer to stay with established optimizers such as SGD and Adam rather than explore newer methods.
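For context, the "traditional SGD" baseline that SMB aims to improve on looks like this in plain NumPy. This is an illustrative sketch of vanilla minibatch SGD on a toy linear-regression problem, not code from the SMB repository; the fixed learning rate here is exactly the kind of hyperparameter SMB's model-building step is meant to make less critical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-regression problem: y = X @ w_true + noise
w_true = np.array([2.0, -1.0])
X = rng.normal(size=(200, 2))
y = X @ w_true + 0.01 * rng.normal(size=200)

w = np.zeros(2)   # parameters to learn
lr = 0.1          # fixed step size; tuning this is what SMB tries to avoid
batch = 20

def mse(w):
    return float(np.mean((X @ w - y) ** 2))

loss_before = mse(w)
for epoch in range(20):
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch):
        b = idx[start:start + batch]
        grad = 2 * X[b].T @ (X[b] @ w - y[b]) / len(b)  # minibatch gradient
        w -= lr * grad                                   # plain SGD update
loss_after = mse(w)
print(loss_after < loss_before)
```

SMB keeps this stochastic loop but, when a plain step is unsatisfactory, builds a local model from extra function/gradient information to choose the step length adaptively.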
Stars: 27
Forks: 4
Language: Python
License: —
Category: —
Last pushed: Feb 15, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/sibirbil/SMB"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
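The same endpoint can also be queried from Python using only the standard library. This is a minimal sketch: the URL path is taken from the curl command above, but the shape of the returned JSON is an assumption, since the API's response schema is not documented here.

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a repository."""
    return f"{API_BASE}/{category}/{owner}/{repo}"

url = quality_url("ml-frameworks", "sibirbil", "SMB")
print(url)

# Uncomment to fetch live data (100 requests/day without a key):
# with urllib.request.urlopen(url, timeout=10) as resp:
#     data = json.load(resp)          # field names depend on the API
```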
Higher-rated alternatives
jolars/sortedl1
Python package for Sorted L-One Penalized Estimation (SLOPE)
gugarosa/opytimizer
🐦 Opytimizer is a Python library consisting of meta-heuristic optimization algorithms.
SENATOROVAI/gradient-descent-sgd-solver-course
Stochastic Gradient Descent (SGD) is an optimization algorithm that updates model parameters...
hiroyuki-kasai/SGDLibrary
MATLAB/Octave library for stochastic optimization algorithms: Version 1.0.20
softmin/ReHLine-python
Regularized Composite ReLU-ReHU Loss Minimization with Linear Computation and Linear Convergence