EnnengYang/Awesome-Forgetting-in-Deep-Learning

A Comprehensive Survey of Forgetting in Deep Learning Beyond Continual Learning. TPAMI, 2024.

Quality score: 39 / 100 (Emerging)

This resource helps machine learning researchers and practitioners understand and address the phenomenon of 'forgetting' in deep learning models, beyond just continual learning. It compiles a comprehensive list of research papers on how models lose previously learned information across various deep learning applications like generative models and federated learning. Researchers looking to build more robust or privacy-aware AI systems would use this collection.

Use this if you are a deep learning researcher or practitioner struggling with models losing past knowledge when trained on new data or tasks, or if you are exploring how forgetting can be leveraged for beneficial outcomes like privacy preservation.

Not ideal if you are looking for an introduction to deep learning fundamentals or a step-by-step guide on how to implement specific machine learning algorithms.

deep-learning-research continual-learning generative-models federated-learning model-robustness machine-unlearning
No license · No package · No dependents
Maintenance: 10 / 25
Adoption: 10 / 25
Maturity: 8 / 25
Community: 11 / 25

Stars: 352
Forks: 17
Language: (not listed)
License: (none)
Last pushed: Jan 27, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/EnnengYang/Awesome-Forgetting-in-Deep-Learning"

Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000/day.
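The curl call above can be wrapped in a few lines of Python. The response schema is not documented on this page, so this sketch only builds the endpoint URL and decodes whatever JSON the API returns; the helper names `quality_url` and `fetch_quality` are hypothetical, not part of the API.

```python
import json
from urllib.request import urlopen

# Base path taken from the curl example on this page.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(repo: str, category: str = "ml-frameworks") -> str:
    """Build the quality endpoint URL for a GitHub `owner/name` slug."""
    return f"{API_BASE}/{category}/{repo}"


def fetch_quality(repo: str) -> dict:
    """Fetch and decode the JSON quality report (100 requests/day without a key)."""
    with urlopen(quality_url(repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Prints the same URL the curl example queries.
    print(quality_url("EnnengYang/Awesome-Forgetting-in-Deep-Learning"))
```

Swap in any other repo slug from the directory to query its scores the same way.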