decile-team/cords
Reduce end-to-end training time from days to hours (or hours to minutes), and cut energy requirements and costs by an order of magnitude, using coresets and data selection.
This tool helps machine learning engineers and researchers dramatically cut the time, compute, and energy needed to train deep learning models. By intelligently selecting smaller, highly representative subsets (coresets) from large datasets, it reduces training time from days to hours, or hours to minutes. You provide your full dataset and your deep learning model; it trains on adaptively selected subsets, yielding a well-performing model much faster.
346 stars. No commits in the last 6 months. Available on PyPI.
Use this if you are a machine learning practitioner struggling with long training times, high GPU costs, or excessive energy consumption for your deep learning models.
Not ideal if your datasets are already very small or if you are not working with deep learning models, as the benefits of data selection might not apply.
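The core idea, training on a small, informative subset instead of the full dataset, can be sketched in plain Python. This is a generic illustration of score-based subset selection, not CORDS's actual API; the `select_subset` helper and the random "informativeness" scores are assumptions for the example (CORDS itself implements strategies such as GLISTER, CRAIG, and GradMatch).

```python
import random

def select_subset(scores, fraction=0.1):
    """Pick the highest-scoring fraction of sample indices.

    `scores` stands in for per-sample losses or gradient norms.
    Real strategies are more sophisticated, but the interface idea
    is the same: full dataset in, small index subset out.
    """
    k = max(1, int(len(scores) * fraction))
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return ranked[:k]

# Toy example: 1,000 samples with random "informativeness" scores.
random.seed(0)
scores = [random.random() for _ in range(1000)]
subset = select_subset(scores, fraction=0.1)
print(len(subset))  # train on 10% of the data each selection round
```

In practice the scores are recomputed every few epochs so the subset tracks what the model currently finds hard, which is how a 10% subset can approach full-data accuracy.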
Stars: 346
Forks: 62
Language: Jupyter Notebook
License: MIT
Category:
Last pushed: May 24, 2023
Commits (30d): 0
Dependencies: 21
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/decile-team/cords"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
Related frameworks
feature-engine/feature_engine
Open-source Python library for feature engineering and selection, compatible with scikit-learn.
alteryx/featuretools
An open-source Python library for automated feature engineering.
cod3licious/autofeat
Linear prediction models with automated feature engineering and selection capabilities.
abess-team/abess
Fast Best-Subset Selection Library
abhayspawar/featexp
Feature exploration for supervised learning