maroxtn/mt5-M2M-comparison
Comparing M2M and mT5 on rare language pairs; blog post: https://medium.com/@abdessalemboukil/comparing-facebooks-m2m-to-mt5-in-low-resources-translation-english-yoruba-ef56624d2b75
This project helps machine learning engineers and researchers quickly compare the translation quality of two multilingual models, mT5 and M2M, on languages with limited training data. It takes a small dataset of parallel sentences in a low-resource language pair (such as Yoruba-English) and outputs a direct comparison of how well each model performs the translation.
No commits in the last 6 months.
Use this if you are a machine learning engineer or researcher evaluating which pre-trained multilingual translation model, specifically mT5 or M2M, is better suited for translating a low-resource language pair.
Not ideal if you need a solution for a high-resource language pair, want to compare different translation models beyond mT5 and M2M, or are looking for a ready-to-use translation application.
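The core of such a comparison is scoring each model's translations against reference sentences. A minimal self-contained sketch is below: the repository presumably uses a standard metric like BLEU (e.g. via sacrebleu), but here a simple token-level F1 stands in so the example runs without extra dependencies, and the model outputs are illustrative placeholders, not real predictions.

```python
# Sketch of the evaluation step: score each model's hypothesis translations
# against references. Token-level F1 is a simplified stand-in for BLEU;
# the hypothesis sentences below are placeholders, not actual model outputs.
from collections import Counter

def token_f1(hypothesis: str, reference: str) -> float:
    hyp = Counter(hypothesis.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((hyp & ref).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(hyp.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

def average_score(hypotheses, references):
    return sum(token_f1(h, r) for h, r in zip(hypotheses, references)) / len(references)

references  = ["how are you today", "thank you very much"]
m2m_outputs = ["how are you today", "thank you so much"]   # placeholder outputs
mt5_outputs = ["how you today", "thanks very much"]        # placeholder outputs

print("M2M:", round(average_score(m2m_outputs, references), 3))
print("mT5:", round(average_score(mt5_outputs, references), 3))
```

In a real run, the hypotheses would come from generating with each model over the parallel test set and the scorer would be a corpus-level metric, but the aggregation logic is the same.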
Stars: 16
Forks: 2
Language: Jupyter Notebook
License: —
Category:
Last pushed: Jun 25, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/maroxtn/mt5-M2M-comparison"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
xv44586/toolkit4nlp
Transformers implementations (architectures, task examples, serving, and more)
luozhouyang/transformers-keras
Transformer-based models implemented in TensorFlow 2.x (using Keras).
ufal/neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
graykode/xlnet-Pytorch
Simple XLNet implementation with a PyTorch wrapper
uzaymacar/attention-mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language...