EnnengYang/Awesome-Forgetting-in-Deep-Learning
A Comprehensive Survey of Forgetting in Deep Learning Beyond Continual Learning. TPAMI, 2024.
This resource helps machine learning researchers and practitioners understand and address 'forgetting' in deep learning models beyond continual learning alone. It compiles a comprehensive list of research papers on how models lose previously learned information across deep learning applications such as generative models and federated learning. It is aimed at researchers building more robust or privacy-aware AI systems.
Use this if you are a deep learning researcher or practitioner struggling with models losing past knowledge when trained on new data or tasks, or if you are exploring how forgetting can be leveraged for beneficial outcomes like privacy preservation.
Not ideal if you are looking for an introduction to deep learning fundamentals or a step-by-step guide on how to implement specific machine learning algorithms.
Stars: 352
Forks: 17
Language: —
License: —
Last pushed: Jan 27, 2026
Commits (last 30 days): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/EnnengYang/Awesome-Forgetting-in-Deep-Learning"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
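The same call can be made from Python. This is a minimal sketch using only the standard library; the endpoint path is taken from the curl example above, but the response schema is not documented here, so the decoded payload is treated as an opaque dictionary.

```python
# Minimal sketch of querying the pt-edge quality API from Python.
# The endpoint path comes from the curl example; the response schema
# is an assumption, so the payload is returned as a plain dict.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks"

def build_url(owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (100 requests/day without a key)."""
    with urllib.request.urlopen(build_url(owner, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    data = fetch_quality("EnnengYang", "Awesome-Forgetting-in-Deep-Learning")
    print(data)
```

With a free API key, the same request can be authenticated for the higher 1,000 requests/day limit; consult the API's documentation for the expected header, since it is not specified here.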
Higher-rated alternatives
aimagelab/mammoth
An Extendible (General) Continual Learning Framework based on Pytorch - official codebase of...
LAMDA-CL/PyCIL
PyCIL: A Python Toolbox for Class-Incremental Learning
GMvandeVen/continual-learning
PyTorch implementation of various methods for continual learning (XdG, EWC, SI, LwF, FROMP, DGR,...
LAMDA-CL/LAMDA-PILOT
🎉 PILOT: A Pre-trained Model-Based Continual Learning Toolbox
mmasana/FACIL
Framework for Analysis of Class-Incremental Learning with 12 state-of-the-art methods and 3 baselines.