FLHonker/Awesome-Knowledge-Distillation
Awesome Knowledge-Distillation. A categorized collection of knowledge distillation papers (2014–2021).
This collection helps machine learning practitioners find relevant research papers on knowledge distillation, a technique for transferring knowledge from large, complex models to smaller, more efficient ones (a minimal loss sketch follows the notes below). Given a research problem or an interest in model optimization, it provides an organized list of academic papers covering the different methods and applications of knowledge distillation. Data scientists and ML engineers who need to deploy performant yet lightweight models will find it valuable.
2,654 stars. No commits in the last 6 months.
Use this if you are a machine learning practitioner researching methods to compress large models into smaller, faster versions without significant performance loss.
Not ideal if you are looking for ready-to-use code implementations or a tutorial for applying knowledge distillation.
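For orientation, the core idea most of the catalogued papers build on is the soft-target loss of Hinton et al. (2015): the student matches the teacher's temperature-softened output distribution in addition to the ground-truth labels. A minimal sketch, assuming PyTorch; the function name and default values are illustrative, not taken from any repo listed here:

import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    # KL divergence between temperature-softened teacher and student
    # distributions; the first argument to kl_div must be log-probabilities.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2  # rescale gradients back to the hard-loss magnitude
    # Standard cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    # alpha balances imitating the teacher against fitting the hard labels.
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

Many of the papers in this list are variations on that template, changing what is matched (features, attention maps, relations) or how the teacher signal is produced.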
Stars: 2,654
Forks: 335
Language: —
License: —
Category: —
Last pushed: May 30, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/FLHonker/Awesome-Knowledge-Distillation"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
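A minimal sketch of consuming the same endpoint from Python using only the standard library; the response field names ("stars", "forks") are assumptions, since the schema is not documented here:

import json
import urllib.request

URL = ("https://pt-edge.onrender.com/api/v1/quality/"
       "ml-frameworks/FLHonker/Awesome-Knowledge-Distillation")

# Fetch and decode the JSON payload for this repository.
with urllib.request.urlopen(URL) as resp:
    data = json.load(resp)

# Field names below are hypothetical; inspect `data` for the real schema.
print(data.get("stars"), data.get("forks"))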
Higher-rated alternatives
Guang000/Awesome-Dataset-Distillation
A curated list of awesome papers on dataset distillation and related applications.
dkozlov/awesome-knowledge-distillation
Awesome Knowledge Distillation
SforAiDl/KD_Lib
A PyTorch Knowledge Distillation library for benchmarking and extending works in the domains of...
SakurajimaMaiii/ProtoKD
[ICASSP 2023] Prototype Knowledge Distillation for Medical Segmentation with Missing Modality
HikariTJU/LD
Localization Distillation for Object Detection (CVPR 2022, TPAMI 2023)