dkozlov/awesome-knowledge-distillation
Awesome Knowledge Distillation
This is a curated collection of research papers on knowledge distillation, a machine-learning technique for compressing large, resource-intensive models into smaller, faster ones that retain much of the original accuracy. It is aimed at machine learning engineers, data scientists, and researchers optimizing model performance for deployment in resource-constrained environments.
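The core idea behind many of the papers collected here is training a small "student" model to match the temperature-softened output distribution of a large "teacher" model. As a minimal sketch (pure Python, following the classic Hinton et al. formulation; the function names and temperature value are illustrative, not taken from any library in this list):

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature: higher T yields a softer distribution,
    exposing the teacher's 'dark knowledge' about wrong-class similarities."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    """KL(teacher || student) over temperature-softened distributions,
    scaled by T^2 so gradients stay comparable across temperatures."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl
```

In practice this soft-label loss is combined with the ordinary cross-entropy loss on the true labels, with a weighting hyperparameter between the two terms; many papers in the list vary exactly what is matched (logits, features, attention maps) rather than this basic recipe.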
3,825 stars; maintained, with 1 commit in the last 30 days.
Use this if you need to research methods for compressing large, accurate machine learning models into smaller, more efficient versions for faster inference or deployment on edge devices.
Not ideal if you are looking for ready-to-use code, tutorials, or a direct implementation of knowledge distillation algorithms.
Stars: 3,825
Forks: 513
Language: —
License: Apache-2.0
Category: —
Last pushed: Dec 25, 2025
Commits (30d): 1
Get this data via API:

```
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/dkozlov/awesome-knowledge-distillation"
```

Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.
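The same endpoint can be called from Python with the standard library. A small sketch, assuming only the URL shape shown in the curl command above (the response schema is not documented here, so the result is treated as an opaque dict):

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def build_url(category, owner, repo):
    """Assemble the endpoint URL; the path shape is taken from the curl example."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category, owner, repo, timeout=10):
    """Fetch and decode the JSON payload. Field names in the response are
    not documented on this page, so inspect the returned dict before use."""
    with urllib.request.urlopen(build_url(category, owner, repo), timeout=timeout) as resp:
        return json.load(resp)

# Example (performs a real network request; subject to the 100 requests/day limit):
# data = fetch_quality("ml-frameworks", "dkozlov", "awesome-knowledge-distillation")
```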
Related frameworks
Guang000/Awesome-Dataset-Distillation: A curated list of awesome papers on dataset distillation and related applications.
SforAiDl/KD_Lib: A PyTorch knowledge distillation library for benchmarking and extending works in the domains of...
SakurajimaMaiii/ProtoKD: Prototype Knowledge Distillation for Medical Segmentation with Missing Modality (ICASSP 2023).
HikariTJU/LD: Localization Distillation for Object Detection (CVPR 2022, TPAMI 2023).
yzd-v/FGD: Focal and Global Knowledge Distillation for Detectors (CVPR 2022).