CownowAn/DaSS

Official PyTorch implementation of "Meta-prediction Model for Distillation-Aware NAS on Unseen Datasets" (ICLR 2023 notable top 25%)

Score: 22 / 100 (Experimental)

This project helps machine learning engineers and researchers efficiently find the best-performing student neural network architecture for a given teacher model and dataset. Instead of time-consuming trial-and-error training, it takes information about your teacher model and target dataset and predicts how well a candidate student architecture will perform under knowledge distillation, enabling faster identification of good student networks. The primary users are ML engineers and researchers working on model compression and optimization.
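The workflow can be sketched abstractly: score each candidate student with a predictor instead of training it, then pick the top-ranked one. This is a toy illustration only; none of the names below come from the repo's actual API, and the scoring function is a hypothetical stand-in for the learned meta-predictor.

```python
# Hypothetical sketch -- NOT the repo's API. Illustrates ranking candidate
# student architectures by a predicted distillation score, with no training runs.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    params_m: float  # parameter count, in millions
    depth: int       # number of layers

def predicted_kd_score(c: Candidate) -> float:
    """Toy stand-in for a learned meta-predictor: favors moderate capacity."""
    return -abs(c.params_m - 5.0) - 0.1 * abs(c.depth - 20)

def best_student(candidates):
    """Return the candidate the (toy) predictor ranks highest."""
    return max(candidates, key=predicted_kd_score)

students = [
    Candidate("tiny", 1.2, 8),
    Candidate("small", 4.8, 18),
    Candidate("large", 24.0, 50),
]
print(best_student(students).name)  # -> small
```

In the real system, the scoring function would be a meta-learned model conditioned on teacher and dataset features rather than a hand-written heuristic.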

No commits in the last 6 months.

Use this if you need to quickly find an efficient student neural network that performs well when learning from a larger, more complex teacher model on a new, unseen dataset.

Not ideal if you are training models from scratch without using knowledge distillation from a pre-existing teacher model.

deep-learning model-optimization neural-architecture-search knowledge-distillation machine-learning-research
No License · Stale 6m · No Package · No Dependents
Maintenance 0 / 25
Adoption 7 / 25
Maturity 8 / 25
Community 7 / 25


Stars: 26
Forks: 2
Language: Python
License: None
Last pushed: Mar 18, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/CownowAn/DaSS"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
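The same endpoint can be consumed from Python. This is a minimal sketch assuming the endpoint returns JSON; the field names used in `summarize` (`name`, `score`) are assumptions, not a documented schema.

```python
# Minimal sketch for consuming the quality API from Python.
# Assumption: the endpoint returns a JSON object; field names are guesses.
import json
import urllib.request

API_URL = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/CownowAn/DaSS"

def fetch_quality(url: str = API_URL, timeout: int = 10) -> dict:
    """Fetch the raw JSON quality report (requires network access)."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.load(resp)

def summarize(report: dict) -> str:
    """One-line summary; 'name' and 'score' are assumed field names."""
    return f"{report.get('name', '?')}: {report.get('score', '?')}/100"

# Offline demo with a stub payload shaped like the assumed response:
print(summarize({"name": "CownowAn/DaSS", "score": 22}))  # -> CownowAn/DaSS: 22/100
```

Check the actual response once (e.g. with the `curl` command above) and adjust the field names before relying on `summarize`.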