hplt-project/monolingual-multilingual-instruction-tuning

Monolingual or Multilingual Instruction Tuning: Which Makes a Better Alpaca

Score: 31 / 100 (Emerging)

This project helps researchers and developers understand how to build better large language models (LLMs) that can work across many languages. It provides code and pre-translated datasets to compare different training methods for LLMs, showing what happens when you train a model using only one language versus multiple languages. Anyone working on making LLMs perform well in different global markets, or studying how language impacts AI, would find this useful.

Use this if you are a machine learning researcher or developer exploring how to create more effective multilingual large language models.

Not ideal if you are a non-technical user looking for a ready-to-use translated AI model or a simple translation tool.

natural-language-processing large-language-models multilingual-AI machine-learning-research AI-model-training
No License · No Package · No Dependents
Maintenance 10 / 25
Adoption 5 / 25
Maturity 8 / 25
Community 8 / 25


Stars: 9
Forks: 1
Language: Python
License: none
Last pushed: Feb 16, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/hplt-project/monolingual-multilingual-instruction-tuning"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
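The same endpoint can be called from Python. A minimal sketch using only the standard library; the URL structure (`/quality/{ecosystem}/{owner}/{repo}`) is inferred from the single example above, and the response is assumed to be JSON, which the page does not document:

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(ecosystem: str, owner: str, repo: str) -> str:
    """Build the quality-score endpoint URL (path layout inferred from the
    documented curl example; treat it as an assumption)."""
    return f"{API_BASE}/{ecosystem}/{owner}/{repo}"


def fetch_quality(ecosystem: str, owner: str, repo: str) -> dict:
    """Fetch the quality report; assumes the API returns JSON."""
    with urllib.request.urlopen(quality_url(ecosystem, owner, repo)) as resp:
        return json.load(resp)


# Reconstructs the exact URL shown in the curl example above.
url = quality_url("transformers", "hplt-project",
                  "monolingual-multilingual-instruction-tuning")
```

`fetch_quality` performs a live request and counts against the 100 requests/day anonymous quota, so cache results where possible.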