Knowledge Distillation Frameworks
Libraries and implementations for knowledge distillation techniques that compress neural networks by transferring knowledge from teacher to student models. Does NOT include general model compression, pruning, quantization, or dataset distillation approaches.
35 knowledge distillation frameworks are tracked, 3 of which score above 50 (the established tier). The highest-rated is Guang000/Awesome-Dataset-Distillation at 67/100 with 1,909 stars. 2 of the top 10 are actively maintained.
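The teacher-to-student transfer these frameworks implement usually reduces to a distillation loss. As a minimal sketch (not any particular library's API), the classic response-based formulation from Hinton et al. blends a temperature-softened KL term against the teacher's logits with the ordinary cross-entropy on hard labels; the `Linear` teacher/student pair below is a toy stand-in for real networks:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets,
                      temperature=4.0, alpha=0.5):
    """Response-based KD loss: weighted sum of soft-target KL and hard-label CE."""
    # Soften both distributions with the temperature.
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # T^2 rescales the KL gradient to the same magnitude as the CE term.
    kd = F.kl_div(soft_student, soft_teacher,
                  reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, targets)
    return alpha * kd + (1 - alpha) * ce

# Toy teacher/student pair on random data.
teacher = nn.Linear(16, 10)
student = nn.Linear(16, 10)
x = torch.randn(8, 16)
y = torch.randint(0, 10, (8,))
with torch.no_grad():              # teacher is frozen
    t_logits = teacher(x)
loss = distillation_loss(student(x), t_logits, y)
loss.backward()                    # gradients flow only into the student
```

Most of the libraries below (e.g. KD_Lib, mdistiller) generalize this template to feature-level and relation-level variants.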
Get all 35 projects as JSON:

```shell
curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=ml-frameworks&subcategory=knowledge-distillation-frameworks&limit=35"
```

Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
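For programmatic use, the same query can be assembled in Python. This is a sketch under stated assumptions: only the query parameters shown in the curl command above are known, and the `X-API-Key` header name for keyed access is a guess (check the API's own docs):

```python
from urllib.parse import urlencode
from urllib.request import Request

BASE = "https://pt-edge.onrender.com/api/v1/datasets/quality"

def build_query(domain, subcategory, limit=35, api_key=None):
    """Build a Request for the quality endpoint.

    The header name used for the optional key is an assumption,
    not documented behavior.
    """
    params = {"domain": domain, "subcategory": subcategory, "limit": limit}
    url = f"{BASE}?{urlencode(params)}"
    headers = {"X-API-Key": api_key} if api_key else {}
    return Request(url, headers=headers)

req = build_query("ml-frameworks", "knowledge-distillation-frameworks")
# Fetch with urllib.request.urlopen(req) and parse the body as JSON.
```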
| # | Framework | Description | Score | Tier |
|---|---|---|---|---|
| 1 | Guang000/Awesome-Dataset-Distillation | A curated list of awesome papers on dataset distillation and related applications. | 67 | Established |
| 2 | dkozlov/awesome-knowledge-distillation | Awesome Knowledge Distillation | | Established |
| 3 | SforAiDl/KD_Lib | A PyTorch Knowledge Distillation library for benchmarking and extending... | | Established |
| 4 | SakurajimaMaiii/ProtoKD | [ICASSP 2023] Prototype Knowledge Distillation for Medical Segmentation with... | | Emerging |
| 5 | HikariTJU/LD | Localization Distillation for Object Detection (CVPR 2022, TPAMI 2023) | | Emerging |
| 6 | yzd-v/FGD | Focal and Global Knowledge Distillation for Detectors (CVPR 2022) | | Emerging |
| 7 | szq0214/FKD | Official code for our ECCV'22 paper "A Fast Knowledge Distillation Framework... | | Emerging |
| 8 | decile-team/distil | DISTIL: Deep dIverSified inTeractIve Learning. An active/inter-active... | | Emerging |
| 9 | megvii-research/mdistiller | The official implementation of [CVPR2022] Decoupled Knowledge Distillation... | | Emerging |
| 10 | FLHonker/Awesome-Knowledge-Distillation | Awesome Knowledge-Distillation. Knowledge distillation papers organized by category (2014-2021). | | Emerging |
| 11 | LutingWang/awesome-knowledge-distillation-for-object-detection | A curated list of awesome knowledge distillation papers and codes for object... | | Emerging |
| 12 | VITA-Group/SymbolicPCC | [NeurIPS 2022] "Symbolic Distillation for Learned TCP Congestion Control",... | | Emerging |
| 13 | circleLZY/MTKD-CD | Official implementation for "JL1-CD: A New Benchmark for Remote Sensing... | | Emerging |
| 14 | NVlabs/DIODE | Official PyTorch implementation of Data-free Knowledge Distillation for... | | Emerging |
| 15 | BatsResearch/csp | Learning to compose soft prompts for compositional zero-shot learning. | | Emerging |
| 16 | DefangChen/SemCKD | [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation... | | Emerging |
| 17 | DefangChen/SimKD | [CVPR-2022] Official implementation for "Knowledge Distillation with the... | | Emerging |
| 18 | ZuchniakK/MTKD | Multi-Teacher Knowledge Distillation, code for my PhD dissertation. I used... | | Emerging |
| 19 | Adamdad/KnowledgeFactor | [ECCV2022] Factorizing Knowledge in Neural Networks | | Emerging |
| 20 | ViTAE-Transformer/SimDistill | The official repo for [AAAI 2024] "SimDistill: Simulated Multi-modal... | | Emerging |
| 21 | twinkle0331/LGTM | [ACL 2023] Code for paper "Tailoring Instructions to Student's Learning... | | Emerging |
| 22 | wjun0830/Difficulty-Aware-Simulator | Official PyTorch Repository of "Difficulty-Aware Simulator for Open Set... | | Emerging |
| 23 | IPL-sharif/KD_Survey | A Comprehensive Survey on Knowledge Distillation | | Emerging |
| 24 | ismail31416/LumiNet | The official (TMLR) implementation of LumiNet: Perception-Driven Knowledge... | | Experimental |
| 25 | lyxiang-casia/EKD | Official Implementation of Evidential Knowledge Distillation (ICCV 2025) | | Experimental |
| 26 | juyongjiang/KaSA | [ICLR'25] Code for KaSA, an official implementation of "KaSA:... | | Experimental |
| 27 | Smooth-humvee686/onpolicydistillation | Apply on-policy distillation to enhance Qwen3-0.6b's performance on GSM8K... | | Experimental |
| 28 | King-Rafat/STKD_CFMitigation | Mitigating carbon footprint for knowledge distillation based deep learning... | | Experimental |
| 29 | AsafShul/PoDD | Official PyTorch Implementation for the "Distilling Datasets Into Less Than... | | Experimental |
| 30 | mashijie1028/TrustDD | (Pattern Recognition 2025) Towards Trustworthy Dataset Distillation | | Experimental |
| 31 | adrianrm99/separating_knowledge | [ICML 2025] Separating Knowledge with Procedural Data | | Experimental |
| 32 | nphdang/FS-BBT | Black-box Few-shot Knowledge Distillation | | Experimental |
| 33 | KefanZhan/YOLOv8-KD | Awesome Knowledge Distillation Methods Implemented on YOLOv8 | | Experimental |
| 34 | pariyajebreili/ConKD-Lion | This project aims to improve the transfer of knowledge from a ResNet-101... | | Experimental |
| 35 | PiaCuk/distillistic | Knowledge distillation algorithms in PyTorch. Source code to Pia Čuk's Master thesis. | | Experimental |