NeurAI-Lab/CLS-ER
The official PyTorch code for the ICLR'22 paper "Learning Fast, Learning Slow: A General Continual Learning Method based on Complementary Learning System"
This project implements a method (CLS-ER) that lets deep learning models learn new information continually without forgetting previously acquired knowledge. It trains on datasets that represent a sequence of learning tasks and reports model performance across all tasks seen so far, targeting scenarios where data arrives over time. It is intended for machine learning researchers and practitioners who build and deploy models that must adapt to evolving data streams.
No commits in the last 6 months.
Use this if you are developing AI models that must learn new tasks incrementally while retaining performance on older tasks, for example, in scenarios with evolving data or new product categories.
Not ideal if your models are trained once on a static dataset and do not require ongoing adaptation or learning from sequential information.
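CLS-ER takes its cue from complementary learning systems in the brain: alongside the working model it maintains two exponential-moving-average (EMA) copies, a fast "plastic" memory and a slow "stable" memory, updated at different rates. The sketch below illustrates that dual-rate idea in plain Python, with weights as lists of floats; the variable names and decay values are illustrative choices, not taken from the repository.

```python
import random

def ema_update(target, source, decay):
    """In-place EMA: target = decay * target + (1 - decay) * source."""
    for i in range(len(target)):
        target[i] = decay * target[i] + (1.0 - decay) * source[i]

random.seed(0)
working = [0.0, 0.0]      # weights moved by gradient steps (simulated below)
plastic = list(working)   # fast EMA: adapts quickly to new tasks
stable = list(working)    # slow EMA: consolidates, resists forgetting

for step in range(100):
    # simulate a training step nudging the working model
    working = [w + 0.01 + random.uniform(-0.1, 0.1) for w in working]
    ema_update(plastic, working, decay=0.9)    # illustrative decay rates
    ema_update(stable, working, decay=0.999)

# after training, the plastic memory sits much closer to the current
# working weights, while the stable memory lags and retains older state
```

In the actual method these EMA snapshots of the model parameters serve as semantic memories that regularize learning on replayed samples; the two decay rates are what trade off fast adaptation against retention.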
Stars
53
Forks
8
Language
Python
License
MIT
Last pushed
Aug 07, 2023
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/NeurAI-Lab/CLS-ER"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
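The same endpoint can be called from Python using only the standard library. The response's field names are not documented here, so this sketch simply returns the parsed JSON payload:

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    # Mirrors the curl example above: path segments are category/owner/repo
    return f"{API_BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    # Makes a network request; response schema is not specified here
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(quality_url("ml-frameworks", "NeurAI-Lab", "CLS-ER"))
```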
Higher-rated alternatives
aimagelab/mammoth
An Extendible (General) Continual Learning Framework based on Pytorch - official codebase of...
LAMDA-CL/PyCIL
PyCIL: A Python Toolbox for Class-Incremental Learning
GMvandeVen/continual-learning
PyTorch implementation of various methods for continual learning (XdG, EWC, SI, LwF, FROMP, DGR,...
LAMDA-CL/LAMDA-PILOT
🎉 PILOT: A Pre-trained Model-Based Continual Learning Toolbox
mmasana/FACIL
Framework for Analysis of Class-Incremental Learning with 12 state-of-the-art methods and 3 baselines.