knagrecha/saturn

Saturn accelerates the training of large-scale deep learning models with a novel joint optimization approach.

Score: 37 / 100 (Emerging)

This system helps machine learning engineers and researchers efficiently train multiple large deep learning models simultaneously, especially during hyperparameter optimization or model selection. You provide your training jobs, and it automatically manages resource allocation and parallelization techniques to speed up the process. The result is significantly faster training across the full batch of jobs.

No commits in the last 6 months.

Use this if you are a machine learning engineer or researcher regularly training multiple large deep learning models and need to optimize their training time and resource utilization.

Not ideal if you are only training a single, small deep learning model or are not concerned with optimizing training efficiency across multiple models.

deep-learning model-training hyperparameter-optimization machine-learning-engineering mlops
Stale (6m) · No Package · No Dependents

Maintenance: 0 / 25
Adoption: 6 / 25
Maturity: 16 / 25
Community: 15 / 25


Stars: 24
Forks: 5
Language: Python
License: Apache-2.0
Last pushed: Nov 22, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/knagrecha/saturn"

Open to everyone: 100 requests/day with no key. A free API key raises the limit to 1,000 requests/day.