IPL-sharif/KD_Survey
A Comprehensive Survey on Knowledge Distillation
This resource provides a comprehensive overview of Knowledge Distillation (KD), a technique for making large, complex AI models (such as LLMs or Vision-Language Models) run efficiently on devices with limited computing power, such as edge devices. It surveys a wide range of KD methods and categorizes them by source, scheme, algorithm, and application across different data types (text, speech, 3D input, and more), giving a structured map of the field. Data scientists, machine learning engineers, and AI researchers working with large neural networks can use it to understand and apply KD effectively.
Use this if you need to deploy large, high-performing AI models onto resource-constrained devices and are looking for techniques to reduce their runtime and memory footprint without significant performance loss.
Not ideal if you are new to deep learning or neural networks, as it assumes familiarity with advanced AI concepts and model optimization strategies.
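For orientation, here is a minimal sketch of the classic soft-target distillation loss (Hinton et al., 2015) that surveys like this one build on: a student is trained to match the teacher's temperature-softened output distribution, blended with ordinary cross-entropy on the hard labels. The temperature and weighting values below are illustrative assumptions, not taken from this survey.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Soft-target KD loss: KL(teacher || student) at temperature T,
    blended with cross-entropy on the hard labels."""
    # Soften both distributions with temperature T.
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    log_soft_student = F.log_softmax(student_logits / T, dim=-1)
    # The KL term is scaled by T^2 so its gradient magnitude stays
    # comparable to the hard-label term as T changes.
    kd = F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * (T * T)
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce

# Example: distill a 10-class teacher into a student on one random batch.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()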
Stars
63
Forks
4
Language
—
License
—
Category
—
Last pushed
Dec 27, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/IPL-sharif/KD_Survey"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
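If you prefer to fetch the same data from Python rather than curl, a minimal sketch follows. Only the endpoint URL is taken from this page; the response fields are not documented here, and the "X-Api-Key" header name for keyed access is a hypothetical placeholder.

import json
import urllib.request

URL = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/IPL-sharif/KD_Survey"

# No key is required for up to 100 requests/day. If you have a free key,
# the header name below is a guess; check the API's docs for the real one.
headers = {"Accept": "application/json"}
# headers["X-Api-Key"] = "<your-key>"  # hypothetical header name

req = urllib.request.Request(URL, headers=headers)
with urllib.request.urlopen(req) as resp:
    data = json.load(resp)

# The response schema is undocumented on this page, so just pretty-print it.
print(json.dumps(data, indent=2))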
Higher-rated alternatives
Guang000/Awesome-Dataset-Distillation
A curated list of awesome papers on dataset distillation and related applications.
dkozlov/awesome-knowledge-distillation
Awesome Knowledge Distillation
SforAiDl/KD_Lib
A Pytorch Knowledge Distillation library for benchmarking and extending works in the domains of...
SakurajimaMaiii/ProtoKD
[ICASSP 2023] Prototype Knowledge Distillation for Medical Segmentation with Missing Modality
HikariTJU/LD
Localization Distillation for Object Detection (CVPR 2022, TPAMI 2023)