joeljang/continual-knowledge-learning

[ICLR 2022] Towards Continual Knowledge Learning of Language Models

Score: 27 / 100 (Experimental)

This project helps machine learning researchers and engineers reproduce the experiments from the ICLR 2022 paper 'Towards Continual Knowledge Learning of Language Models'. It provides scripts to download the required datasets and pre-trained model checkpoints. The output is a trained language model that can be evaluated on the paper's benchmarks, demonstrating continual knowledge learning without forgetting.

No commits in the last 6 months.

Use this if you are a researcher or ML engineer interested in continually updating large language models with new information without forgetting old knowledge.

Not ideal if you are looking for a plug-and-play solution for an application or a library to integrate into existing systems without deep knowledge of model training.

continual learning, language models, model training, research reproduction, knowledge updating
No License | Stale (6m) | No Package | No Dependents
Maintenance: 0 / 25
Adoption: 9 / 25
Maturity: 8 / 25
Community: 10 / 25


Stars: 91
Forks: 7
Language: Python
License: None
Last pushed: Oct 11, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/joeljang/continual-knowledge-learning"
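The same endpoint can also be queried from Python. This is a minimal sketch using only the standard library; it assumes the endpoint returns a JSON body, whose field names are not documented here:

```python
import json
import urllib.request

# Endpoint taken from the curl example above.
URL = ("https://pt-edge.onrender.com/api/v1/quality/"
       "transformers/joeljang/continual-knowledge-learning")

def fetch_quality(url: str = URL) -> dict:
    """Fetch the quality report; assumes the response body is JSON."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Pretty-print whatever the API returns.
    print(json.dumps(fetch_quality(), indent=2))
```

Swap the repository path segment in the URL to look up a different project.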

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.