decile-team/cords

Reduce end-to-end training time from days to hours (or hours to minutes), and cut energy requirements and costs by an order of magnitude, using coresets and data selection.

Score: 56 / 100 (Established)

This tool helps machine learning engineers and researchers dramatically cut down the time, computational resources, and energy needed to train deep learning models. By intelligently selecting smaller, highly representative subsets from large datasets, it reduces training time from days to hours, or hours to minutes. You input your full dataset and your deep learning model, and it outputs a highly efficient training process that yields a well-performing model much faster.
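To make the idea concrete, here is a minimal sketch of per-epoch subset selection. Random selection is used here only as a stand-in baseline; CORDS itself implements gradient-informed strategies (e.g. GLISTER, CRAIG, GradMatch) that choose subsets whose gradients approximate those of the full dataset. The `model.fit` call is hypothetical.

```python
import numpy as np

def select_subset(num_samples: int, fraction: float,
                  rng: np.random.Generator) -> np.ndarray:
    """Pick a random subset of training indices for one epoch.

    A smarter selector (as in CORDS) would replace this random choice
    with one driven by per-sample gradient information.
    """
    k = max(1, int(num_samples * fraction))
    return rng.choice(num_samples, size=k, replace=False)

# Illustration: train on 10% of a 50,000-sample dataset each epoch,
# so each epoch touches roughly 10x less data than full training.
rng = np.random.default_rng(0)
for epoch in range(3):
    idx = select_subset(50_000, 0.1, rng)
    # model.fit(X[idx], y[idx])  # hypothetical training step on the subset
    print(f"epoch {epoch}: training on {len(idx)} samples")
```

Because only the selection step changes, the rest of the training loop, optimizer, and model stay exactly as they were.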

346 stars. No commits in the last 6 months. Available on PyPI.

Use this if you are a machine learning practitioner struggling with long training times, high GPU costs, or excessive energy consumption for your deep learning models.

Not ideal if your datasets are already very small or if you are not working with deep learning models, as the benefits of data selection might not apply.

deep-learning-optimization machine-learning-engineering model-training-efficiency resource-management hyperparameter-tuning
Stale: no commits in 6 months
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 25 / 25
Community: 21 / 25

How are scores calculated?
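The overall score appears to be the sum of the four subscores, each out of 25; this is a guess consistent with the numbers shown, since the site's exact formula is not given here:

```python
# Subscores shown on this page; the overall score matches their sum.
subscores = {"Maintenance": 0, "Adoption": 10, "Maturity": 25, "Community": 21}
overall = sum(subscores.values())
print(overall)  # 56, out of a maximum of 4 * 25 = 100
```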

Stars: 346
Forks: 62
Language: Jupyter Notebook
License: MIT
Last pushed: May 24, 2023
Commits (30d): 0
Dependencies: 21

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/decile-team/cords"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
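The same request can be made from Python with only the standard library. This is a sketch built from the curl example above; the header used to pass an API key is not documented here, so the request below is unauthenticated:

```python
from urllib.parse import quote
from urllib.request import Request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    # Build the endpoint URL shown in the curl example above.
    return f"{BASE}/{quote(category)}/{quote(owner)}/{quote(repo)}"

url = quality_url("ml-frameworks", "decile-team", "cords")
req = Request(url, headers={"Accept": "application/json"})
print(url)
# To actually fetch (network required):
#   import json, urllib.request
#   data = json.load(urllib.request.urlopen(req))
```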