NeurAI-Lab/CLS-ER

The official PyTorch code for the ICLR'22 paper "Learning Fast, Learning Slow: A General Continual Learning Method based on Complementary Learning System"

Score: 39 / 100 (Emerging)

This project provides a method for deep learning models to learn new information continually without forgetting previously learned knowledge. It trains on datasets presented as a sequence of learning tasks and reports performance metrics across those tasks, particularly in scenarios where data arrives over time. It is intended for machine learning researchers and practitioners who develop and deploy models that must adapt to evolving data streams.
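The "learning fast, learning slow" idea named in the paper title pairs a quickly updated working model with slowly updated semantic memories. A common way to realize the slow component is an exponential moving average (EMA) of the fast model's parameters. The sketch below is a minimal illustration of that EMA update only, not the repository's actual code; the function name and decay value are assumptions:

```python
def ema_update(slow_params, fast_params, decay=0.999):
    """Move the slow (stable) model's parameters a small step
    toward the fast (plastic) model's current parameters.

    With decay close to 1, the slow model integrates knowledge
    gradually and is less disturbed by any single new task.
    """
    return [decay * s + (1.0 - decay) * f
            for s, f in zip(slow_params, fast_params)]

# Toy example: the slow model lags behind the fast learner.
slow = [0.0, 0.0]
fast = [1.0, 2.0]
slow = ema_update(slow, fast, decay=0.9)
print(slow)  # slow drifts a small fraction of the way toward fast
```

In a full continual-learning setup this update would run alongside replay from a memory buffer, with the slow model used for stable predictions.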

No commits in the last 6 months.

Use this if you are developing AI models that must learn new tasks incrementally while retaining performance on older tasks, for example, in scenarios with evolving data or new product categories.

Not ideal if your models are trained once on a static dataset and do not require ongoing adaptation or learning from sequential information.

continual learning, lifelong learning, machine learning research, model adaptation, sequential data processing
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 8 / 25
Maturity 16 / 25
Community 15 / 25

How are scores calculated?

Stars: 53
Forks: 8
Language: Python
License: MIT
Last pushed: Aug 07, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/NeurAI-Lab/CLS-ER"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
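The endpoint path in the curl example follows a category/owner/repo pattern. A small Python helper to build such URLs (a sketch inferred from the single example above, not an official client; the function name is an assumption):

```python
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-report URL for a given repository."""
    return f"{API_BASE}/{category}/{owner}/{repo}"

print(quality_url("ml-frameworks", "NeurAI-Lab", "CLS-ER"))
# https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/NeurAI-Lab/CLS-ER
```

The URL could then be fetched with any HTTP client (e.g. `curl` as shown above), subject to the per-day request limits.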