zhengli97/DM-KD
Official PyTorch Code for "Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?" (https://arxiv.org/abs/2305.12954)
This project helps machine learning practitioners train smaller, more efficient image classification models without needing large real-world datasets. It uses synthetic images generated by diffusion models as the training material for knowledge distillation: you provide a 'teacher' model and the synthetic images, and it produces a 'student' model that performs well on image classification tasks such as object or flower recognition.
No commits in the last 6 months.
Use this if you need to train image classification models but lack access to a large, real dataset, and want to leverage synthetic data effectively for knowledge distillation.
Not ideal if you have ample real-world data and are not concerned with reducing model size or training time via distillation with synthetic data.
Stars: 48
Forks: 2
Language: Python
License: —
Category:
Last pushed: Dec 03, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/diffusion/zhengli97/DM-KD"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
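For programmatic access, the curl command above can be reproduced in Python. This is a minimal sketch: only the endpoint URL comes from this page, and the `quality_url`/`fetch_quality` helper names are illustrative. The API's response schema is not documented here, so the fetch helper simply decodes whatever JSON the endpoint returns.

```python
# Hypothetical helpers for the pt-edge quality API shown above.
# Only the base URL is taken from this page; everything else is a sketch.
import json
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality/diffusion"


def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (requires network access)."""
    with urlopen(quality_url(owner, repo), timeout=10) as resp:
        return json.load(resp)


print(quality_url("zhengli97", "DM-KD"))
# → https://pt-edge.onrender.com/api/v1/quality/diffusion/zhengli97/DM-KD
```

Unauthenticated calls count against the 100-requests/day limit, so cache responses locally if you poll more than a handful of repositories.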
Higher-rated alternatives
quantgirluk/aleatory
📦 Python library for Stochastic Processes Simulation and Visualisation
blei-lab/treeffuser
Treeffuser is an easy-to-use package for probabilistic prediction and probabilistic regression...
TuftsBCB/RegDiffusion
Diffusion model for gene regulatory network inference.
yuanchenyang/smalldiffusion
Simple and readable code for training and sampling from diffusion models
chairc/Integrated-Design-Diffusion-Model
IDDM (Industrial, landscape, animate, latent diffusion), support LDM, DDPM, DDIM, PLMS, webui...