christinakim/scaling-laws-for-language-transfer

Code for "Scaling Laws for Language Transfer Learning".

Score: 21 / 100 (Experimental)

This project helps machine learning researchers measure how much pre-training a language model on English data improves its performance when fine-tuning on other languages such as Chinese, Spanish, or German. It fine-tunes existing English pre-trained models on non-English text datasets and derives "scaling laws" for language transfer. It is aimed at machine learning scientists and researchers working on multilingual natural language processing.
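To make the setup concrete, here is a minimal sketch of the kind of experiment the repo studies: fine-tuning an English pre-trained language model on a non-English corpus. It uses the Hugging Face transformers and datasets libraries; the model ("gpt2") and the German OSCAR split are illustrative assumptions, not the repo's actual code or data.

# Minimal sketch of the transfer-learning setup studied here: take an
# English pre-trained causal LM and fine-tune it on target-language text.
# NOTE: the model name and dataset below are illustrative assumptions,
# not the repo's actual configuration.
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)
from datasets import load_dataset

tokenizer = AutoTokenizer.from_pretrained("gpt2")   # English pre-training
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Hypothetical target-language corpus (German here); any text dataset works.
dataset = load_dataset("oscar", "unshuffled_deduplicated_de",
                       split="train[:1%]")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True,
                        remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="ft-de",
                           per_device_train_batch_size=4,
                           num_train_epochs=1),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()  # fine-tuning loss vs. data size is what scaling laws track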

No commits in the last 6 months.

Use this if you are a machine learning researcher exploring the optimal strategies for developing multilingual language models, specifically concerning the role of English pre-training.

Not ideal if you are a practitioner looking for a ready-to-use, production-grade multilingual model for direct application.

multilingual-nlp transfer-learning language-model-research cross-lingual-ai nlp-scaling-laws
No License · Stale (6 months) · No Package · No Dependents

Maintenance: 0 / 25
Adoption: 5 / 25
Maturity: 8 / 25
Community: 8 / 25


Stars: 9
Forks: 1
Language: Python
License: None
Last pushed: Apr 18, 2021
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/christinakim/scaling-laws-for-language-transfer"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
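If you prefer Python over curl, here is a minimal sketch of the same call. The response is assumed to be JSON mirroring the fields shown on this page (e.g. score and stars); the field names in the final line are assumptions, not a documented schema.

# Minimal sketch of fetching the quality data from Python.
import requests

url = ("https://pt-edge.onrender.com/api/v1/quality/"
       "transformers/christinakim/scaling-laws-for-language-transfer")
resp = requests.get(url, timeout=10)
resp.raise_for_status()
data = resp.json()  # assumed JSON payload with the fields shown above
print(data.get("score"), data.get("stars"))  # field names are assumptions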