vorobeevich/distillation-in-dg
Implementation of "Weight Averaging Improves Knowledge Distillation under Domain Shift" (ICCV 2023 OOD-CV Workshop)
This project helps machine learning engineers keep image classification models accurate when they encounter new, unseen image styles or sources. It takes a pre-trained 'teacher' model and a 'student' model, along with image datasets spanning several visual domains, and trains the student via knowledge distillation combined with weight averaging. The output is a 'student' model that better maintains its accuracy on images from domains it wasn't explicitly trained on.
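At a glance, the technique the title names looks roughly like the following minimal PyTorch sketch. This is not the repository's actual training code: the temperature `T`, the mixing weight `alpha`, the `WeightAverager` class, and the uniform averaging schedule are all illustrative assumptions.

```python
import copy
import torch
import torch.nn.functional as F

def distill_step(student, teacher, x, y, optimizer, T=4.0, alpha=0.5):
    """One knowledge-distillation step: cross-entropy on the labels plus
    KL divergence to the teacher's temperature-softened outputs."""
    teacher.eval()
    with torch.no_grad():
        t_logits = teacher(x)
    s_logits = student(x)
    ce = F.cross_entropy(s_logits, y)
    kd = F.kl_div(
        F.log_softmax(s_logits / T, dim=1),
        F.softmax(t_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # standard T^2 scaling keeps gradient magnitudes comparable
    loss = alpha * ce + (1 - alpha) * kd
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

class WeightAverager:
    """Running (uniform) average of the student's weights; the averaged
    snapshot, not the last iterate, is kept as the final model.
    BatchNorm buffers are omitted here for brevity."""
    def __init__(self, model):
        self.avg_model = copy.deepcopy(model)
        self.n = 0

    @torch.no_grad()
    def update(self, model):
        self.n += 1
        for p_avg, p in zip(self.avg_model.parameters(), model.parameters()):
            p_avg.mul_((self.n - 1) / self.n).add_(p / self.n)
```

Calling `averager.update(student)` after each step (or each epoch) and evaluating `averager.avg_model` reflects the paper's premise that the averaged student generalizes better under domain shift than the final iterate.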
No commits in the last 6 months.
Use this if you need to deploy an image classification model that can reliably perform across diverse visual environments or datasets without significant performance drops.
Not ideal if your image classification models only operate on data from a single, consistent visual domain and never encounter variations.
Stars: 19
Forks: 1
Language: Python
License: MIT
Category:
Last pushed: Oct 01, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/diffusion/vorobeevich/distillation-in-dg"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
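For example, the same data can be fetched from Python. The endpoint is taken from the curl command above; the JSON structure of the response is not documented here, so this sketch just prints whatever comes back.

```python
import requests

# Endpoint from the listing above (no API key needed for 100 requests/day).
url = ("https://pt-edge.onrender.com/api/v1/quality/diffusion/"
       "vorobeevich/distillation-in-dg")

resp = requests.get(url, timeout=10)
resp.raise_for_status()  # surface HTTP errors instead of parsing bad bodies
print(resp.json())
```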