ZixuanKe/PyContinual
PyContinual (An Easy and Extendible Framework for Continual Learning)
This framework helps machine learning researchers and practitioners develop and evaluate models that continually learn new information without forgetting previously learned knowledge. It supports a range of language and image datasets and lets you experiment with different continual learning baselines and neural network architectures. Researchers advancing or applying continual learning techniques will find it most useful.
325 stars. No commits in the last 6 months.
Use this if you are developing or experimenting with continual learning algorithms for natural language processing or image classification tasks and need a standardized way to compare different approaches.
Not ideal if you are looking for a ready-to-use, production-grade application or a tool for general machine learning tasks beyond continual learning research.
Stars: 325
Forks: 69
Language: Python
License: —
Category: —
Last pushed: Jan 29, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/ZixuanKe/PyContinual"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
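If you prefer calling the endpoint from Python rather than curl, a minimal client might look like the sketch below. Only the endpoint URL comes from this page; the `Authorization` header name and the JSON response shape are assumptions, so check the API docs before relying on them.

```python
import json
import urllib.request

# Base endpoint taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/nlp"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-card URL for a given GitHub owner/repo."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str, api_key: str = "") -> dict:
    """Fetch and decode the JSON quality card.

    Without a key you get 100 requests/day; a free key raises that
    to 1,000/day. The Bearer header below is a guess at the auth
    scheme, not confirmed by the page.
    """
    req = urllib.request.Request(quality_url(owner, repo))
    if api_key:
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)

print(quality_url("ZixuanKe", "PyContinual"))
```

The URL builder reproduces the exact path used in the curl example, so `fetch_quality("ZixuanKe", "PyContinual")` hits the same endpoint.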
Higher-rated alternatives
debjitpaul/refiner
The corresponding code from our paper "REFINER: Reasoning Feedback on Intermediate...
THUDM/P-tuning
A novel method to tune language models. Codes and datasets for the paper "GPT Understands, Too".
arazd/ProgressivePrompts
Progressive Prompts: Continual Learning for Language Models
Nithin-Holla/MetaLifelongLanguage
Repository containing code for the paper "Meta-Learning with Sparse Experience Replay for...
SALT-NLP/IDBR
Codes for the paper: "Continual Learning for Text Classification with Information...