SALT-NLP/Adaptive-Compositional-Modules

Code for the ACL 2022 paper "Continual Sequence Generation with Adaptive Compositional Modules"

Quality score: 29 / 100 (Experimental)

This project offers machine learning researchers a way to train natural language processing models that learn new tasks over time without forgetting previously learned information. You provide text data for a sequence of tasks, and the system trains a model that adapts and grows its capacity incrementally, enabling continual learning. It is aimed at researchers working on lifelong or continual learning in NLP.

No commits in the last 6 months.

Use this if you are an NLP researcher developing language models and need a framework for continually adding new knowledge and tasks without incurring 'catastrophic forgetting' or needing to retrain from scratch.

Not ideal if you are a practitioner looking for a ready-to-use NLP application or if you don't have experience with deep learning research frameworks like Hugging Face Transformers.

Tags: continual learning, lifelong learning, natural language processing research, sequence generation, machine learning research
Status: Stale (6m), No Package, No Dependents
Maintenance: 0 / 25
Adoption: 7 / 25
Maturity: 16 / 25
Community: 6 / 25
The four component scores sum to the overall 29 / 100.


Stars: 39
Forks: 2
Language: Python
License: MIT
Last pushed: Apr 04, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/SALT-NLP/Adaptive-Compositional-Modules"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
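
For programmatic access from Python, here is a minimal sketch using the requests library. The endpoint URL is taken from the curl command above; the shape of the returned JSON is not documented on this page, so the response is printed as-is rather than parsed into specific fields.

import requests

# Minimal sketch: fetch the quality report for this repository.
# The URL matches the curl example above; the response schema is an
# assumption, so the JSON is printed rather than accessed by field name.
url = (
    "https://pt-edge.onrender.com/api/v1/quality/"
    "transformers/SALT-NLP/Adaptive-Compositional-Modules"
)

response = requests.get(url, timeout=10)
response.raise_for_status()  # fail loudly on rate limiting or server errors
print(response.json())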