NVlabs/DIODE
Official PyTorch implementation of Data-free Knowledge Distillation for Object Detection, WACV 2021.
DIODE helps machine learning engineers improve the performance of smaller object detection models without needing the original training data. You provide an existing, larger object detection model, and it generates synthetic images with varied objects and scenes. These generated images are then used to train a smaller, "student" model, enabling it to learn from the larger "teacher" model's knowledge.
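The teacher-to-student transfer described above is, at its core, classic knowledge distillation. As an illustrative sketch only (not DIODE's actual code, which also handles detection-specific outputs and synthetic image generation), here is the standard temperature-softened distillation loss in plain Python, with no framework dependencies:

```python
# Illustrative sketch of the classic distillation objective (Hinton et al.):
# a temperature-softened KL divergence between teacher and student logits.
# This is NOT DIODE's implementation, just the underlying idea.
import math

def softmax(logits, T=1.0):
    """Softmax with temperature T; higher T produces softer probabilities."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) at temperature T, scaled by T^2 as is conventional."""
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student predictions
    return T * T * sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
```

The student minimizes this loss on the generated images, so it matches the teacher's soft output distribution rather than hard labels from the (unavailable) original dataset.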
No commits in the last 6 months.
Use this if you need to transfer the capabilities of a high-performing but large object detection model to a more efficient, smaller model, especially when the original training dataset is unavailable or sensitive.
Not ideal if you already have the original training dataset readily available, or if distilling knowledge into a smaller model is not your goal.
Stars
63
Forks
7
Language
Jupyter Notebook
License
—
Category
Last pushed
Oct 12, 2021
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/NVlabs/DIODE"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
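If you prefer to consume the endpoint programmatically rather than via curl, a minimal Python sketch follows. The URL pattern is inferred from the single example above, and the helper names (`quality_url`, `fetch_quality`) are my own; the response schema is not documented here, so it is parsed only as generic JSON:

```python
# Minimal sketch for calling the pt-edge quality API.
# The URL pattern (/api/v1/quality/<category>/<owner>/<repo>) is inferred from
# the curl example above; function names here are hypothetical.
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the API URL for one repository listing."""
    return f"{API_BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (no key needed up to 100 requests/day)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

# Example: fetch_quality("ml-frameworks", "NVlabs", "DIODE")
```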
Higher-rated alternatives
Guang000/Awesome-Dataset-Distillation
A curated list of awesome papers on dataset distillation and related applications.
dkozlov/awesome-knowledge-distillation
Awesome Knowledge Distillation
SforAiDl/KD_Lib
A Pytorch Knowledge Distillation library for benchmarking and extending works in the domains of...
SakurajimaMaiii/ProtoKD
[ICASSP 2023] Prototype Knowledge Distillation for Medical Segmentation with Missing Modality
HikariTJU/LD
Localization Distillation for Object Detection (CVPR 2022, TPAMI 2023)