awesome-knowledge-distillation and Awesome-Knowledge-Distillation
These are **competitors**: both are curated collections of knowledge distillation papers and resources that survey the same field. The first is English-language and more actively maintained, with the higher star count; the second is annotated in Chinese and covers papers from 2014 to 2021.
About awesome-knowledge-distillation
dkozlov/awesome-knowledge-distillation
Awesome Knowledge Distillation
This is a curated collection of research papers on knowledge distillation, a model-compression technique in machine learning. It serves practitioners who need to make large, complex models efficient enough for real-world deployment: the papers detail how to compress high-performing but resource-intensive models into smaller, faster ones while retaining much of their accuracy. The resource targets machine learning engineers, data scientists, and researchers optimizing models for resource-constrained environments.
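For readers new to the technique, here is a minimal sketch of the classic soft-target distillation loss from Hinton et al. (2015), written in PyTorch. The function name, temperature `T`, and weighting `alpha` are illustrative choices, not code from either repository:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Soft-target distillation loss (Hinton et al., 2015) -- illustrative sketch.

    Blends standard cross-entropy on the ground-truth labels with a
    KL-divergence term that pulls the student's temperature-softened
    output distribution toward the teacher's.
    """
    # Soft targets: soften both distributions with temperature T.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale gradients, as in the original paper
    # Hard targets: ordinary cross-entropy against the true labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Hypothetical usage: logits from a frozen teacher and a trainable student.
teacher_logits = torch.randn(8, 10)
student_logits = torch.randn(8, 10, requires_grad=True)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```

The temperature softens both probability distributions so the student also learns from the teacher's relative rankings of wrong classes, not just its top prediction; many of the papers in both collections build on variants of this loss.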
About Awesome-Knowledge-Distillation
FLHonker/Awesome-Knowledge-Distillation
Awesome Knowledge-Distillation. Knowledge distillation papers (2014-2021), organized by category.
This collection helps machine learning practitioners find research papers on knowledge distillation, a technique for transferring knowledge from large, complex models to smaller, more efficient ones. Given a research problem or an interest in model optimization, it points to an organized list of academic papers covering the different methods and applications of knowledge distillation. Data scientists and ML engineers who need to deploy performant yet lightweight models will find it valuable.