# Knowledge Distillation Compression for Transformer Models
Four knowledge-distillation compression models are tracked. The highest-rated is microsoft/AdaMix at 37/100 with 138 stars.
Get all 4 projects as JSON:

```shell
curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=transformers&subcategory=knowledge-distillation-compression&limit=20"
```

Open to everyone: 100 requests/day with no key; a free key raises the limit to 1,000/day.
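As a sketch, the same query can be built and issued from Python with only the standard library. The endpoint and query parameters come from the curl example above; the `api_key` parameter name is an assumption, as the source does not show how a key is passed.

```python
import json
import urllib.parse
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/datasets/quality"

def build_url(domain, subcategory, limit=20, api_key=None):
    """Assemble the quality-endpoint URL with query parameters."""
    params = {"domain": domain, "subcategory": subcategory, "limit": limit}
    if api_key is not None:
        # Assumed parameter name for the free key that raises the daily quota.
        params["api_key"] = api_key
    return BASE + "?" + urllib.parse.urlencode(params)

url = build_url("transformers", "knowledge-distillation-compression")

# To actually fetch the JSON (requires network access):
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
```

Keeping the fetch itself separate from URL construction makes it easy to swap in a different HTTP client or add retry handling later.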
| # | Model | Description | Score | Tier |
|---|---|---|---|---|
| 1 | microsoft/AdaMix | This is the implementation of the paper AdaMix: Mixture-of-Adaptations for... | 37 | Emerging |
| 2 | pphuc25/distil-cd | Distillation Contrastive Decoding: Improving LLMs Reasoning with Contrastive... | | Experimental |
| 3 | taissirboukrouba/Structured-Information-Retrieval-with-LLMs | Academic Sequence Labelling Between DistillBERT & Encoder-only Transformer | | Experimental |
| 4 | mominalix/LLM-Model-Distillation-for-Text-Classification-Models-GUI | GUI application that performs knowledge distillation from OpenAI models to... | | Experimental |