SforAiDl/KD_Lib
A PyTorch knowledge distillation library for benchmarking and extending work in the domains of Knowledge Distillation, Pruning, and Quantization.
This tool helps machine learning engineers and researchers make their trained neural networks smaller and faster, without significantly losing accuracy. You provide a large, high-performing 'teacher' model and a smaller 'student' model, along with your training data. The output is a more compact, efficient 'student' model that mimics the teacher's performance.
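The teacher–student workflow described above rests on classic response-based distillation (Hinton et al.): the student is trained to match the teacher's temperature-softened output distribution. Below is a minimal sketch of that core loss in plain Python; it is illustrative only and does not use KD_Lib's actual API (function names here are invented for the example).

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between softened teacher and student distributions.

    The T^2 factor (from Hinton et al.) keeps gradient magnitudes
    comparable across temperature settings.
    """
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl
```

In practice this distillation term is combined with the ordinary cross-entropy loss on the ground-truth labels, weighted by a mixing coefficient; libraries like KD_Lib manage that combination and the training loop for you.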
652 stars. No commits in the last 6 months. Available on PyPI.
Use this if you need to deploy complex deep learning models to environments with limited computational resources, such as mobile devices or edge hardware.
Not ideal if you are a business user without a deep understanding of machine learning model architectures and training processes.
Stars
652
Forks
61
Language
Python
License
MIT
Category
ML Frameworks
Last pushed
Mar 01, 2023
Commits (30d)
0
Dependencies
26
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/SforAiDl/KD_Lib"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related frameworks
Guang000/Awesome-Dataset-Distillation
A curated list of awesome papers on dataset distillation and related applications.
dkozlov/awesome-knowledge-distillation
Awesome Knowledge Distillation
SakurajimaMaiii/ProtoKD
[ICASSP 2023] Prototype Knowledge Distillation for Medical Segmentation with Missing Modality
HikariTJU/LD
Localization Distillation for Object Detection (CVPR 2022, TPAMI 2023)
yzd-v/FGD
Focal and Global Knowledge Distillation for Detectors (CVPR 2022)