Knowledge Distillation & Compression: Diffusion Models

Two knowledge-distillation/compression projects are tracked in the diffusion domain. The highest-rated is qitianwu/DIFFormer at 34/100 with 313 stars.

Get both projects as JSON

curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=diffusion&subcategory=knowledge-distillation-compression&limit=20"

Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000 requests/day.
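The curl command above can also be issued from Python. The sketch below only rebuilds the query URL shown in the curl example; the fetch itself is commented out, and the response field names are not documented here, so treat any parsing as an assumption.

```python
# Minimal sketch of querying the quality endpoint shown above.
# Only the URL construction is taken from the document; the
# response shape is an assumption and is not parsed here.
from urllib.parse import urlencode

BASE = "https://pt-edge.onrender.com/api/v1/datasets/quality"

def build_url(domain: str, subcategory: str, limit: int = 20) -> str:
    """Assemble the query URL used in the curl example."""
    params = {"domain": domain, "subcategory": subcategory, "limit": limit}
    return f"{BASE}?{urlencode(params)}"

url = build_url("diffusion", "knowledge-distillation-compression")
print(url)
# To actually fetch (requires network access):
#   import json, urllib.request
#   data = json.load(urllib.request.urlopen(url))
```

Running this prints the exact URL from the curl example, so it is easy to verify the two commands are equivalent before adding a key or changing the limit.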

#  Model                              Description                                                                      Score  Tier
1  qitianwu/DIFFormer                 The official implementation for ICLR23 spotlight paper "DIFFormer: Scalable...  34     Emerging
2  amazon-science/crossnorm-selfnorm  CrossNorm and SelfNorm for Generalization under Distribution Shifts, ICCV 2021  34     Emerging