Knowledge Distillation Frameworks

Libraries and implementations for knowledge distillation techniques that compress neural networks by transferring knowledge from teacher to student models. Does NOT include general model compression, pruning, quantization, or dataset distillation approaches.

35 knowledge distillation frameworks are tracked; 3 score above 50, placing them in the established tier. The highest-rated is Guang000/Awesome-Dataset-Distillation at 67/100 with 1,909 stars. 2 of the top 10 are actively maintained.

Get all 35 projects as JSON

curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=ml-frameworks&subcategory=knowledge-distillation-frameworks&limit=35"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
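Once you have the JSON, you can slice it however you like. The sketch below filters for established-tier projects (score above 50) and prints them highest-first. It uses a sample payload built from the table on this page because the live response schema is not documented here; the field names (`projects`, `name`, `score`, `tier`) are illustrative assumptions, so check them against an actual API response.

```python
import json

# Illustrative sample of what the API might return; entries are taken
# from the ranking below, but the field names are assumed, not documented.
sample_response = json.loads("""
{
  "projects": [
    {"name": "Guang000/Awesome-Dataset-Distillation", "score": 67, "tier": "Established"},
    {"name": "dkozlov/awesome-knowledge-distillation", "score": 57, "tier": "Established"},
    {"name": "SforAiDl/KD_Lib", "score": 52, "tier": "Established"},
    {"name": "SakurajimaMaiii/ProtoKD", "score": 48, "tier": "Emerging"}
  ]
}
""")

# Keep only established-tier projects (score above 50).
established = [p for p in sample_response["projects"] if p["tier"] == "Established"]

# Print them from highest score to lowest.
for p in sorted(established, key=lambda p: p["score"], reverse=True):
    print(f'{p["name"]}: {p["score"]}')
```

To run this against the live endpoint instead, fetch the URL above (e.g. with `urllib.request.urlopen`) and pass the response body to `json.load`.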

| # | Framework | Description | Score | Tier |
|---|-----------|-------------|-------|------|
| 1 | Guang000/Awesome-Dataset-Distillation | A curated list of awesome papers on dataset distillation and related applications. | 67 | Established |
| 2 | dkozlov/awesome-knowledge-distillation | Awesome Knowledge Distillation | 57 | Established |
| 3 | SforAiDl/KD_Lib | A Pytorch Knowledge Distillation library for benchmarking and extending... | 52 | Established |
| 4 | SakurajimaMaiii/ProtoKD | [ICASSP 2023] Prototype Knowledge Distillation for Medical Segmentation with... | 48 | Emerging |
| 5 | HikariTJU/LD | Localization Distillation for Object Detection (CVPR 2022, TPAMI 2023) | 45 | Emerging |
| 6 | yzd-v/FGD | Focal and Global Knowledge Distillation for Detectors (CVPR 2022) | 44 | Emerging |
| 7 | szq0214/FKD | Official code for our ECCV'22 paper "A Fast Knowledge Distillation Framework... | 44 | Emerging |
| 8 | decile-team/distil | DISTIL: Deep dIverSified inTeractIve Learning. An active/inter-active... | 44 | Emerging |
| 9 | megvii-research/mdistiller | The official implementation of [CVPR2022] Decoupled Knowledge Distillation... | 40 | Emerging |
| 10 | FLHonker/Awesome-Knowledge-Distillation | Awesome Knowledge-Distillation. Knowledge distillation papers (2014-2021), organized by category. | 39 | Emerging |
| 11 | LutingWang/awesome-knowledge-distillation-for-object-detection | A curated list of awesome knowledge distillation papers and codes for object... | 38 | Emerging |
| 12 | VITA-Group/SymbolicPCC | [NeurIPS 2022] "Symbolic Distillation for Learned TCP Congestion Control",... | 37 | Emerging |
| 13 | circleLZY/MTKD-CD | Official implementation for "JL1-CD: A New Benchmark for Remote Sensing... | 37 | Emerging |
| 14 | NVlabs/DIODE | Official PyTorch implementation of Data-free Knowledge Distillation for... | 36 | Emerging |
| 15 | BatsResearch/csp | Learning to compose soft prompts for compositional zero-shot learning. | 36 | Emerging |
| 16 | DefangChen/SemCKD | [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation... | 35 | Emerging |
| 17 | DefangChen/SimKD | [CVPR-2022] Official implementation for "Knowledge Distillation with the... | 35 | Emerging |
| 18 | ZuchniakK/MTKD | Multi-Teacher Knowledge Distillation, code for my PhD dissertation. I used... | 33 | Emerging |
| 19 | Adamdad/KnowledgeFactor | [ECCV2022] Factorizing Knowledge in Neural Networks | 33 | Emerging |
| 20 | ViTAE-Transformer/SimDistill | The official repo for [AAAI 2024] "SimDistill: Simulated Multi-modal... | 31 | Emerging |
| 21 | twinkle0331/LGTM | [ACL 2023] Code for paper "Tailoring Instructions to Student's Learning... | 31 | Emerging |
| 22 | wjun0830/Difficulty-Aware-Simulator | Official PyTorch Repository of "Difficulty-Aware Simulator for Open Set... | 31 | Emerging |
| 23 | IPL-sharif/KD_Survey | A Comprehensive Survey on Knowledge Distillation | 30 | Emerging |
| 24 | ismail31416/LumiNet | The official (TMLR) implementation of LumiNet: Perception-Driven Knowledge... | 29 | Experimental |
| 25 | lyxiang-casia/EKD | Official Implementation of Evidential Knowledge Distillation (ICCV 2025) | 29 | Experimental |
| 26 | juyongjiang/KaSA | [ICLR'25] Code for KaSA, an official implementation of "KaSA:... | 29 | Experimental |
| 27 | Smooth-humvee686/onpolicydistillation | Apply on-policy distillation to enhance Qwen3-0.6b's performance on GSM8K... | 27 | Experimental |
| 28 | King-Rafat/STKD_CFMitigation | Mitigating carbon footprint for knowledge distillation based deep learning... | 24 | Experimental |
| 29 | AsafShul/PoDD | Official PyTorch Implementation for the "Distilling Datasets Into Less Than... | 23 | Experimental |
| 30 | mashijie1028/TrustDD | (Pattern Recognition 2025) Towards Trustworthy Dataset Distillation | 21 | Experimental |
| 31 | adrianrm99/separating_knowledge | [ICML 2025] Separating Knowledge with Procedural Data | 21 | Experimental |
| 32 | nphdang/FS-BBT | Black-box Few-shot Knowledge Distillation | 19 | Experimental |
| 33 | KefanZhan/YOLOv8-KD | Awesome Knowledge Distillation Methods Implemented on YOLOv8 | 15 | Experimental |
| 34 | pariyajebreili/ConKD-Lion | This project aims to improve the transfer of knowledge from a ResNet-101... | 14 | Experimental |
| 35 | PiaCuk/distillistic | Knowledge distillation algorithms in PyTorch. Source code to Pia Čuk's Master thesis. | 12 | Experimental |