gentaiscool/few-shot-lm

The source code of "Language Models are Few-shot Multilingual Learners" (MRL @ EMNLP 2021)

Score: 29 / 100 (Experimental)

This project helps natural language processing researchers evaluate and fine-tune multilingual language models when training data is limited. Given an existing language model and a small number of examples in one or more languages, it produces an improved model that can understand and generate text across languages, including tasks with little task-specific data. NLP researchers and computational linguists working on cross-lingual transfer learning will find this tool useful.

No commits in the last 6 months.

Use this if you need to adapt large language models to new languages or tasks with minimal examples, aiming for strong performance in multilingual settings.

Not ideal if you are looking for a pre-trained, ready-to-use multilingual model without needing to perform further research or adaptation.

multilingual-NLP few-shot-learning cross-lingual-transfer language-model-adaptation computational-linguistics
Badges: Stale (6 months) · No package · No dependents
Maintenance: 0 / 25
Adoption: 8 / 25
Maturity: 16 / 25
Community: 5 / 25


Stars: 53
Forks: 2
Language: Python
License: Apache-2.0
Last pushed: Jun 12, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/gentaiscool/few-shot-lm"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
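The curl command above can also be reproduced in Python. A minimal sketch: the helper name `quality_url` and the `registry` parameter are illustrative assumptions, and since the response schema is not documented on this page, the actual fetch is left as a comment.

```python
from urllib.request import urlopen
import json

# Base endpoint taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(repo: str, registry: str = "transformers") -> str:
    """Build the quality-score endpoint URL for a repository.

    `registry` mirrors the "transformers" path segment in the curl
    example; its exact meaning is an assumption here.
    """
    return f"{API_BASE}/{registry}/{repo}"

url = quality_url("gentaiscool/few-shot-lm")
print(url)
# → https://pt-edge.onrender.com/api/v1/quality/transformers/gentaiscool/few-shot-lm

# To actually fetch (requires network; response fields undocumented here):
# with urlopen(url) as resp:
#     data = json.load(resp)
```

This keeps the URL construction testable offline while leaving the network call explicit and optional.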