decile-team/distil
DISTIL: Deep dIverSified inTeractIve Learning. An active/interactive learning library built on PyTorch for reducing labeling costs.
This project helps machine learning practitioners cut the time and cost of labeling large datasets for deep learning models. By selecting only the most informative data points for human annotation, it sharply reduces how much data must be manually labeled, letting data scientists and ML engineers reach high-performing models with a fraction of the usual labeling effort (the selection idea is sketched in code below).
156 stars. No commits in the last 6 months.
Use this if you are building deep learning models and spend too much time or money on manually labeling large datasets.
Not ideal if your datasets are small, already perfectly labeled, or if you are not working with deep learning models.
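The heart of any such library is an acquisition function that scores unlabeled examples by how informative they would be to label. As a concrete illustration, here is a minimal entropy-based uncertainty sampling loop in plain PyTorch. The function name and argument layout are illustrative only, not DISTIL's own interface; DISTIL wraps this strategy and others (margin sampling, BADGE, CoreSet, ...) in its own strategy classes.

```python
import torch
import torch.nn.functional as F

def select_most_informative(model, unlabeled_loader, budget, device="cpu"):
    # Score every unlabeled example by predictive entropy and return the
    # indices of the `budget` most uncertain ones. This is classic entropy
    # sampling, one of the acquisition strategies DISTIL implements.
    model.eval()
    scores = []
    with torch.no_grad():
        for batch in unlabeled_loader:
            x = batch[0] if isinstance(batch, (list, tuple)) else batch
            probs = F.softmax(model(x.to(device)), dim=1)
            # H(p) = -sum_c p_c log p_c; high entropy = low model confidence
            entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=1)
            scores.append(entropy.cpu())
    scores = torch.cat(scores)
    # Positions within the unlabeled pool to send to human annotators.
    return torch.topk(scores, k=budget).indices
```

In a full active learning loop, the returned indices are labeled by an annotator, moved into the training set, the model is retrained, and the selection step repeats until the labeling budget runs out.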
Stars: 156
Forks: 26
Language: Jupyter Notebook
License: MIT
Category: ml-frameworks
Last pushed: Feb 05, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/decile-team/distil"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
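The same endpoint can be queried from a script. A minimal Python sketch using `requests`; the response schema is not documented on this page, so the example simply prints the raw JSON:

```python
import requests

# Endpoint from the curl example above; anonymous access allows
# 100 requests/day, and a free key raises that to 1,000/day.
URL = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/decile-team/distil"

resp = requests.get(URL, timeout=10)
resp.raise_for_status()
print(resp.json())  # schema isn't documented here, so just dump the payload
```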
Higher-rated alternatives
Guang000/Awesome-Dataset-Distillation
A curated list of awesome papers on dataset distillation and related applications.
dkozlov/awesome-knowledge-distillation
Awesome Knowledge Distillation
SforAiDl/KD_Lib
A Pytorch Knowledge Distillation library for benchmarking and extending works in the domains of...
SakurajimaMaiii/ProtoKD
[ICASSP 2023] Prototype Knowledge Distillation for Medical Segmentation with Missing Modality
HikariTJU/LD
Localization Distillation for Object Detection (CVPR 2022, TPAMI 2023)