Nithin-Holla/MetaLifelongLanguage
Repository containing code for the paper "Meta-Learning with Sparse Experience Replay for Lifelong Language Learning".
This project helps machine learning researchers and practitioners evaluate and develop lifelong learning models for natural language processing tasks. It takes text datasets for classification (such as news articles or product reviews) or relation extraction as input. The output is a trained model that can learn new tasks sequentially without forgetting earlier ones (i.e., it mitigates catastrophic forgetting), which is crucial for systems that must adapt over time.
No commits in the last 6 months.
Use this if you are a researcher or NLP engineer working on models that need to adapt to new language tasks sequentially, without losing proficiency on tasks learned earlier.
Not ideal if you are looking for a plug-and-play solution for a single, static text classification or relation extraction task.
Stars
22
Forks
4
Language
Python
License
MIT
Category
Last pushed
Jun 12, 2023
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/Nithin-Holla/MetaLifelongLanguage"
Open to everyone: 100 requests/day, no key needed. A free key raises the limit to 1,000/day.
Higher-rated alternatives
debjitpaul/refiner
The corresponding code from our paper "REFINER: Reasoning Feedback on Intermediate...
THUDM/P-tuning
A novel method to tune language models. Codes and datasets for paper "GPT understands, too".
ZixuanKe/PyContinual
PyContinual (An Easy and Extendible Framework for Continual Learning)
arazd/ProgressivePrompts
Progressive Prompts: Continual Learning for Language Models
SALT-NLP/IDBR
Codes for the paper: "Continual Learning for Text Classification with Information...