zzz47zzz/codebase-for-incremental-learning-with-llm

[ACL2024] A Codebase for Incremental Learning with Large Language Models; official released code for "Learn or Recall? Revisiting Incremental Learning with Pre-trained Language Models (ACL 2024)", "Incremental Sequence Labeling: A Tale of Two Shifts (ACL 2024 Findings)", and "Concept-1K: A Novel Benchmark for Instance Incremental Learning (arXiv)"

Score: 31 / 100 (Emerging)

This project helps researchers and practitioners in natural language processing develop and test models that can learn new information over time without forgetting what they have already learned. It takes text datasets for tasks such as classification, intent detection, and named entity recognition, and implements incremental learning techniques on top of large language models. The output is a more adaptable and robust language model that performs well on new tasks while retaining knowledge from old ones, making it suitable for AI/ML researchers, data scientists, and language model developers.

No commits in the last 6 months.

Use this if you are developing or evaluating large language models that need to continuously learn from new data streams without suffering from 'catastrophic forgetting' on previously learned tasks.

Not ideal if you are looking for a pre-trained, ready-to-use application, as this is a research codebase requiring technical expertise to set up and run experiments.

Natural Language Processing, Continual Learning, Large Language Models, Machine Learning Research, AI Model Development
No License, Stale (6 months), No Package, No Dependents
Maintenance 0 / 25
Adoption 8 / 25
Maturity 8 / 25
Community 15 / 25


Stars: 60
Forks: 9
Language: Python
License: None
Last pushed: Feb 01, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/zzz47zzz/codebase-for-incremental-learning-with-llm"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
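The same endpoint shown in the curl command above can be queried from Python. This is a minimal sketch assuming only the URL shape visible in that command; the fields of the JSON response are not documented here, and the helper names (`quality_url`, `fetch_quality`) are illustrative, not part of the API.

```python
import json
import urllib.request

# Base path taken from the curl example above
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(platform: str, owner: str, repo: str) -> str:
    """Build the quality-report URL for a repository."""
    return f"{API_BASE}/{platform}/{owner}/{repo}"


def fetch_quality(platform: str, owner: str, repo: str) -> dict:
    """Fetch the quality report and parse it as JSON.

    The response schema is not specified on this page, so the
    returned dict is passed through unchanged.
    """
    with urllib.request.urlopen(quality_url(platform, owner, repo)) as resp:
        return json.load(resp)


# Example: the URL for this repository (no network call made here)
url = quality_url(
    "transformers", "zzz47zzz", "codebase-for-incremental-learning-with-llm"
)
```

Without an API key this is limited to 100 requests per day, so cache responses if you poll more than a handful of repositories.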