Awesome-Dataset-Distillation and awesome-knowledge-distillation

These are ecosystem siblings within knowledge compression—one curates papers on **dataset distillation** (compressing training data itself) while the other covers **knowledge distillation** (compressing trained models), representing two complementary but distinct subfields of model compression research.

| | Awesome-Dataset-Distillation | awesome-knowledge-distillation |
|---|---|---|
| Maintenance | 22/25 | 9/25 |
| Adoption | 10/25 | 10/25 |
| Maturity | 16/25 | 16/25 |
| Community | 19/25 | 22/25 |
| Stars | 1,909 | 3,825 |
| Forks | 170 | 513 |
| Downloads | | |
| Commits (30d) | 60 | 1 |
| Language | HTML | |
| License | MIT | Apache-2.0 |
| Package | No Package, No Dependents | No Package, No Dependents |

About Awesome-Dataset-Distillation

Guang000/Awesome-Dataset-Distillation

A curated list of awesome papers on dataset distillation and related applications.

This project compiles a detailed list of research papers on dataset distillation: a method for creating a much smaller, synthetic dataset that can train AI models to perform almost as well as training on the original, much larger dataset. Its primary audience is machine learning researchers and practitioners who work with large datasets and need to reduce their size for efficiency or other applications.

machine-learning-research data-miniaturization model-training-efficiency continual-learning data-privacy
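
The idea behind dataset distillation can be illustrated with a toy example. The sketch below is ours, not from the curated list, and uses a linear least-squares learner rather than the neural-network bilevel methods the papers cover: a synthetic dataset of just 5 points (identity inputs, labels set to the full-data solution) trains the same learner to the identical weights it would learn from 1,000 real samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Real" dataset: 1,000 samples, 5 features, linear ground truth plus noise.
X = rng.normal(size=(1000, 5))
w_true = rng.normal(size=5)
y = X @ w_true + 0.1 * rng.normal(size=1000)

def fit_ols(X, y):
    # Ordinary least squares stands in for "training a model".
    return np.linalg.lstsq(X, y, rcond=None)[0]

w_full = fit_ols(X, y)

# Distilled dataset: 5 synthetic samples constructed so that the same
# learner recovers exactly the full-data solution.
X_syn = np.eye(5)
y_syn = w_full.copy()
w_distilled = fit_ols(X_syn, y_syn)

assert np.allclose(w_full, w_distilled)  # 5 synthetic points == 1,000 real ones
```

For neural networks no such closed-form construction exists, which is why the papers in this list optimize the synthetic data itself (via gradient matching, trajectory matching, and related objectives).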

About awesome-knowledge-distillation

dkozlov/awesome-knowledge-distillation

Awesome Knowledge Distillation

This is a curated collection of research papers on knowledge distillation, a machine learning technique for making large, complex models efficient enough for real-world deployment. The papers detail how to compress high-performing but resource-intensive models into smaller, faster ones while retaining much of their accuracy. The resource targets machine learning engineers, data scientists, and researchers who optimize model performance for deployment in resource-constrained environments.

model-optimization machine-learning-deployment deep-learning-efficiency AI-model-compression resource-constrained-AI
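
The core mechanism most of these papers build on is the soft-target loss: the student is trained to match the teacher's temperature-softened output distribution. A minimal NumPy sketch of that loss, assuming the standard Hinton-style formulation (function names and the temperature value are ours):

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T produces softer distributions.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on softened distributions, scaled by T^2
    # so gradients keep a consistent magnitude across temperatures.
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student's softened predictions
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))
```

A student whose logits already match the teacher's incurs (near-)zero loss, while any mismatch yields a positive penalty; in practice this term is combined with the ordinary cross-entropy on hard labels.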

Scores updated daily from GitHub, PyPI, and npm data.