sibirbil/SMB

Stochastic gradient descent with model building

Score: 27 / 100 (Experimental)

This is an optimization algorithm designed to train deep learning models more efficiently. Given an existing deep learning model and training data, it produces a refined model that converges faster and generalizes better. It is aimed at deep learning practitioners and researchers who build and train neural networks.

No commits in the last 6 months.

Use this if you are training deep learning models and want faster convergence and better generalization with less hyperparameter tuning than traditional SGD or Adam.

Not ideal if you are not working with deep learning models, or if you prefer to stick with established optimizers such as SGD or Adam rather than exploring new methods.
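To give a feel for what "model building" means here, below is a minimal one-dimensional sketch: take a trial SGD step, fit a quadratic model along that step direction from the loss values and slope, and jump to the model's minimizer instead. This illustrates the general interpolation technique only; the toy function, step size, and fallback rule are assumptions for illustration, not the repo's actual implementation.

```python
def f(w):
    """Toy loss with its minimum at w = 3 (illustration only)."""
    return (w - 3.0) ** 2

def grad(w):
    """Gradient of the toy loss."""
    return 2.0 * (w - 3.0)

def model_building_step(w, lr):
    """One step: trial SGD move, then a quadratic model along that direction."""
    g = grad(w)
    f0 = f(w)
    slope = -lr * g * g        # derivative of a -> f(w - a*lr*g) at a = 0
    f1 = f(w - lr * g)         # loss at the full trial SGD step (a = 1)
    curv = f1 - f0 - slope     # curvature term of the fitted quadratic
    if curv <= 0.0:            # model not convex along the step: keep plain SGD
        return w - lr * g
    alpha = -slope / (2.0 * curv)  # minimizer of the quadratic model
    return w - alpha * lr * g

w = 0.0
lr = 1.0                       # deliberately too large: plain SGD would overshoot to w = 6
for _ in range(5):
    w = model_building_step(w, lr)
print(w)                       # converges to the minimizer w = 3
```

With a quadratic loss the interpolated step lands on the line minimizer exactly, which is why a single model-building step recovers from the oversized learning rate here.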

Tags: deep-learning-optimization, neural-network-training, machine-learning-research, model-convergence, stochastic-optimization

No License · Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 7 / 25
Maturity: 8 / 25
Community: 12 / 25

The overall score of 27 / 100 is the sum of the four category scores (0 + 7 + 8 + 12).

How are scores calculated?

Stars: 27
Forks: 4
Language: Python
License: None
Last pushed: Feb 15, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/sibirbil/SMB"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
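The curl call above can also be made from Python with only the standard library. The endpoint path is taken from the example; the helper names here and the shape of the JSON response body are assumptions.

```python
import json
import urllib.request

# Base path taken from the curl example above
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(collection, owner, repo):
    """Build the API URL for a project's quality data."""
    return f"{API_BASE}/{collection}/{owner}/{repo}"

def fetch_quality(collection, owner, repo, timeout=10):
    """Fetch and decode the quality data (assumes a JSON response body)."""
    with urllib.request.urlopen(quality_url(collection, owner, repo),
                                timeout=timeout) as resp:
        return json.load(resp)

print(quality_url("ml-frameworks", "sibirbil", "SMB"))
# → https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/sibirbil/SMB
```

Calling `fetch_quality("ml-frameworks", "sibirbil", "SMB")` would perform the same request as the curl example, subject to the 100 requests/day limit.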