DefangChen/Knowledge-Distillation-Paper
This repository maintains a collection of important papers on knowledge distillation (awesome-knowledge-distillation).
This collection provides a curated list of research papers on knowledge distillation, a technique for compressing large, complex deep learning models into smaller, more efficient ones without significant loss of performance. It catalogs academic papers and helps identify pioneering works, survey articles, and specific applications such as accelerating diffusion models or improving segmentation. Machine learning researchers, practitioners, and students focused on model compression and efficiency would find this useful.
No commits in the last 6 months.
Use this if you need to quickly find relevant academic papers to understand, apply, or research knowledge distillation techniques in deep learning.
Not ideal if you are looking for code implementations, tutorials, or a high-level conceptual overview without diving into academic literature.
Stars: 84
Forks: 16
Language: —
License: —
Category: —
Last pushed: Mar 19, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/diffusion/DefangChen/Knowledge-Distillation-Paper"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
lixinustc/Awesome-diffusion-model-for-image-processing
one summary of diffusion-based image processing, including restoration, enhancement, coding,...
showlab/Awesome-Video-Diffusion
A curated list of recent diffusion models for video generation, editing, and various other applications.
xlite-dev/Awesome-DiT-Inference
📚A curated list of Awesome Diffusion Inference Papers with Codes: Sampling, Cache, Quantization,...
wangkai930418/awesome-diffusion-categorized
collection of diffusion model papers categorized by their subareas
ChenHsing/Awesome-Video-Diffusion-Models
[CSUR] A Survey on Video Diffusion Models