Awesome-Knowledge-Distillation and awesome-knowledge-distillation-for-object-detection
These are ecosystem siblings: one is a broad knowledge-distillation resource covering general techniques across domains (2014-2021), while the other specializes in applying those same distillation methods to object detection.
About Awesome-Knowledge-Distillation
FLHonker/Awesome-Knowledge-Distillation
Awesome Knowledge-Distillation. Knowledge-distillation papers (2014-2021), organized by category.
This collection helps machine learning practitioners find relevant research papers on knowledge distillation, a technique for transferring knowledge from a large, complex model to a smaller, more efficient one. Given a research problem or an interest in model optimization, it provides an organized list of academic papers covering the different methods and applications of knowledge distillation. Data scientists and ML engineers who need to deploy performant yet lightweight models will find it valuable.
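To make the core idea concrete, here is a minimal sketch of the classic soft-target distillation loss popularized by Hinton et al., written in plain Python for illustration; the function names and the temperature value are our own choices, not from either repository:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: a higher temperature softens the
    distribution, exposing the teacher's 'dark knowledge' about
    relative class similarities."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    """KL divergence between the softened teacher and student
    distributions, scaled by T^2 so gradient magnitudes stay
    comparable across temperatures."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return kl * temperature ** 2
```

In practice this term is combined with the ordinary cross-entropy on hard labels; many of the papers catalogued in the list vary exactly this recipe (what is matched, and how).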
About awesome-knowledge-distillation-for-object-detection
LutingWang/awesome-knowledge-distillation-for-object-detection
A curated list of awesome knowledge distillation papers and codes for object detection.
This is a curated list of research papers and associated code for knowledge-distillation techniques in object detection. These techniques make object detection models, which identify and locate objects within images or videos, faster and more efficient, especially for deployment on devices with limited computing power. The resource is for researchers, machine learning engineers, and computer vision scientists working to optimize and deploy object detection systems.
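Detection-oriented distillation often matches intermediate feature maps rather than class logits, frequently restricted to foreground regions so the student focuses on object areas. The sketch below is an illustrative toy version of that idea in plain Python (real implementations operate on GPU tensors); the function name, box format, and masking strategy are our own simplifications, not taken from any specific paper in the list:

```python
def feature_imitation_loss(teacher_fmap, student_fmap, boxes):
    """Mean squared error between teacher and student feature maps,
    restricted to cells covered by ground-truth boxes.

    teacher_fmap, student_fmap: 2D lists (H x W) of floats.
    boxes: list of (x0, y0, x1, y1) cell coordinates, x1/y1 exclusive.
    """
    h, w = len(teacher_fmap), len(teacher_fmap[0])
    # Build a binary foreground mask from the boxes.
    mask = [[0] * w for _ in range(h)]
    for x0, y0, x1, y1 in boxes:
        for y in range(max(y0, 0), min(y1, h)):
            for x in range(max(x0, 0), min(x1, w)):
                mask[y][x] = 1
    n = sum(sum(row) for row in mask)
    if n == 0:
        return 0.0  # no foreground cells, nothing to imitate
    sq = sum((teacher_fmap[y][x] - student_fmap[y][x]) ** 2
             for y in range(h) for x in range(w) if mask[y][x])
    return sq / n
```

Many of the papers collected in the list differ precisely in how this mask is built (ground-truth boxes, attention maps, learned regions) and in which feature levels are matched.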