maroxtn/mt5-M2M-comparison

Comparing M2M and mT5 on a rare language pair. Blog post: https://medium.com/@abdessalemboukil/comparing-facebooks-m2m-to-mt5-in-low-resources-translation-english-yoruba-ef56624d2b75

Score: 23 / 100 (Experimental)

This project helps machine learning engineers and researchers quickly compare the translation quality of two specific multilingual models, mT5 and M2M, for languages with limited training data. It takes a small dataset of parallel sentences in a low-resource language pair (like Yoruba-English) and outputs a direct comparison of how well each model performs translation.
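The notebook's exact evaluation code is not reproduced here, but comparisons like this typically score each model's translations against reference sentences with an n-gram overlap metric. A minimal sketch of one such metric, a smoothed sentence-level BLEU, is below; the add-one smoothing and whitespace tokenization are assumptions for illustration, not the repository's actual method:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU: clipped n-gram precision (add-one smoothed)
    combined by geometric mean, times a brevity penalty."""
    cand, ref = candidate.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(cand, n))
        ref_counts = Counter(ngrams(ref, n))
        # Clip each candidate n-gram count by its count in the reference.
        overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = max(1, sum(cand_counts.values()))
        precisions.append((overlap + 1) / (total + 1))  # add-one smoothing
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    # Penalize candidates shorter than the reference.
    bp = 1.0 if len(cand) >= len(ref) else math.exp(1 - len(ref) / max(1, len(cand)))
    return bp * geo_mean
```

Averaging this score over a held-out parallel corpus for each model's outputs gives the kind of head-to-head number the project reports.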

No commits in the last 6 months.

Use this if you are a machine learning engineer or researcher evaluating which of the two pre-trained multilingual translation models, mT5 or M2M, is better suited for translating a low-resource language pair.

Not ideal if you need a solution for a high-resource language pair, want to compare different translation models beyond mT5 and M2M, or are looking for a ready-to-use translation application.

Machine-translation Low-resource-languages NLP-model-evaluation Language-AI-development AI-research
No License · Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 6 / 25
Maturity 8 / 25
Community 9 / 25


Stars: 16
Forks: 2
Language: Jupyter Notebook
License: none
Last pushed: Jun 25, 2021
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/maroxtn/mt5-M2M-comparison"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
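The endpoint URL above follows an apparent `/quality/<category>/<owner>/<repo>` pattern. A small helper for building such URLs might look like this; the path layout is inferred from the single example shown, so treat it as an assumption rather than documented API behavior:

```python
def quality_api_url(category: str, repo: str) -> str:
    """Build the quality-score API URL for a GitHub repo.

    `category` (e.g. "nlp") is assumed to be a fixed taxonomy slug;
    `repo` is the "owner/name" pair, passed through unescaped.
    """
    base = "https://pt-edge.onrender.com/api/v1/quality"
    return f"{base}/{category}/{repo}"

print(quality_api_url("nlp", "maroxtn/mt5-M2M-comparison"))
```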