yhy1117/X-Mixup

Implementation of ICLR 2022 paper "Enhancing Cross-lingual Transfer by Manifold Mixup".

Score: 22 / 100 (Experimental)

This project helps NLP researchers evaluate and improve cross-lingual transfer: how well models trained on one language perform on other languages. It takes existing multilingual datasets for tasks such as natural language inference, question answering, and part-of-speech tagging, and reports performance metrics that quantify the cross-lingual transfer capabilities of different models. Researchers developing multilingual NLP models would use it to understand and improve generalization across languages.
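For intuition, here is a minimal sketch of the core manifold mixup idea behind the paper: interpolating hidden representations of parallel source- and target-language inputs. The function name and tensors below are illustrative assumptions, not this repository's actual API:

    import numpy as np

    def manifold_mixup(h_src, h_tgt, alpha=0.2, rng=None):
        # Draw an interpolation weight lambda ~ Beta(alpha, alpha) and
        # mix the two hidden-state tensors elementwise.
        rng = rng or np.random.default_rng()
        lam = rng.beta(alpha, alpha)
        return lam * h_src + (1.0 - lam) * h_tgt, lam

    # Toy example: mix English and German sentence representations
    # (batch of 8 sentences, hidden size 768; random stand-ins here).
    h_en = np.random.default_rng(0).standard_normal((8, 768))
    h_de = np.random.default_rng(1).standard_normal((8, 768))
    h_mix, lam = manifold_mixup(h_en, h_de)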

No commits in the last 6 months.

Use this if you are an NLP researcher working on making language models perform better across multiple languages, especially for tasks like classification or question answering.

Not ideal if you are looking for a general-purpose library for building new NLP models from scratch or if you only work with single-language datasets.

Tags: multilingual NLP, cross-lingual transfer, natural language understanding, NLP research, model evaluation
No License · Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 6 / 25
Maturity: 8 / 25
Community: 8 / 25

Stars: 21
Forks: 2
Language: Python
License: none
Last pushed: May 25, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/yhy1117/X-Mixup"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
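The same payload can be fetched programmatically. A short Python sketch, assuming the endpoint returns a JSON body (the field names are not documented here):

    import requests

    url = "https://pt-edge.onrender.com/api/v1/quality/transformers/yhy1117/X-Mixup"
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    data = resp.json()  # assumed JSON; inspect the keys before relying on them
    print(data)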