DefangChen/SimKD
[CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier".
When building image classification models, this toolbox helps machine learning engineers improve smaller, more efficient "student" models by transferring knowledge from larger, more accurate "teacher" models. It takes a pre-trained teacher model and training data (e.g., CIFAR-100 or ImageNet) as input, and outputs a student model that performs better on classification tasks than one trained alone. This is useful for anyone deploying efficient computer vision models.
102 stars. No commits in the last 6 months.
Use this if you need to create compact and efficient image classification models that maintain high accuracy, leveraging state-of-the-art knowledge distillation techniques.
Not ideal if you are working with non-image data types or need to develop models from scratch without relying on a teacher model for knowledge transfer.
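The paper's central idea is that the student does not learn its own classifier: student features are projected into the teacher's feature space, trained to match the teacher's features, and then passed through the teacher's frozen classifier at inference. The sketch below illustrates that idea with NumPy; the dimensions, weight matrices (`W_proj`, `W_cls`), and random features standing in for backbone activations are all illustrative assumptions, not the repository's actual API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: student features (d_s) are projected up to the
# teacher's feature dimension (d_t) so the teacher's classifier can
# be reused on student features -- the core SimKD idea.
d_s, d_t, n_classes, batch = 64, 128, 10, 32

# Frozen classifier weights (in practice, copied from the pre-trained
# teacher; random here for illustration).
W_cls = rng.normal(0.0, 0.1, size=(d_t, n_classes))

# Trainable projector mapping student features into the teacher space.
W_proj = rng.normal(0.0, 0.1, size=(d_s, d_t))

def feature_match_loss(f_student, f_teacher, W_proj):
    """MSE between projected student features and teacher features."""
    f_hat = f_student @ W_proj  # (batch, d_t)
    return np.mean((f_hat - f_teacher) ** 2), f_hat

# Random features stand in for real backbone activations.
f_s = rng.normal(size=(batch, d_s))
f_t = rng.normal(size=(batch, d_t))

loss, f_hat = feature_match_loss(f_s, f_t, W_proj)

# At inference, projected student features go through the reused
# teacher classifier to produce class logits.
logits = f_hat @ W_cls
print(loss, logits.shape)
```

Training would minimize `loss` with respect to the projector (and the student backbone); no cross-entropy on student logits is needed, which is what distinguishes this approach from classic logit-matching distillation.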
Stars
102
Forks
19
Language
Python
License
—
Category
Last pushed
Jun 16, 2022
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/DefangChen/SimKD"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
Guang000/Awesome-Dataset-Distillation
A curated list of awesome papers on dataset distillation and related applications.
dkozlov/awesome-knowledge-distillation
Awesome Knowledge Distillation
SforAiDl/KD_Lib
A Pytorch Knowledge Distillation library for benchmarking and extending works in the domains of...
SakurajimaMaiii/ProtoKD
[ICASSP 2023] Prototype Knowledge Distillation for Medical Segmentation with Missing Modality
HikariTJU/LD
Localization Distillation for Object Detection (CVPR 2022, TPAMI 2023)