zwhe99/SelfTraining4UNMT

[ACL 2022] Bridging the Data Gap between Training and Inference for Unsupervised Neural Machine Translation

19 / 100
Experimental

This project offers an improved method for Unsupervised Neural Machine Translation (UNMT), which translates text between languages without parallel corpora (human-translated sentence pairs). It takes monolingual text in two languages and produces a machine translation model capable of translating between them. It is primarily useful for researchers and machine translation engineers working on advanced language processing systems.

No commits in the last 6 months.

Use this if you are a researcher or engineer looking to advance the state-of-the-art in unsupervised neural machine translation, particularly for English-French, English-German, or English-Romanian language pairs.

Not ideal if you need an out-of-the-box translation service for general business use, as this requires significant technical expertise to set up and run.

machine-translation natural-language-processing AI-research computational-linguistics language-AI-development
No License · Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 7 / 25
Maturity: 8 / 25
Community: 4 / 25

How are scores calculated?

Stars: 31
Forks: 1
Language: Python
License: none
Last pushed: Oct 06, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/zwhe99/SelfTraining4UNMT"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
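As a sketch, the JSON response can be consumed from a script like the one below. Note that the field names (`score`, `maintenance`, `adoption`, `maturity`, `community`) are assumptions for illustration, not a documented schema; a hard-coded sample payload stands in for the live API response.

```python
import json

# Hypothetical example payload mirroring the scores shown on this page.
# The real response schema is not documented here, so these field names
# are assumptions for illustration only.
sample = (
    '{"score": 19, "maintenance": 0, "adoption": 7,'
    ' "maturity": 8, "community": 4}'
)

data = json.loads(sample)

# The overall score is the sum of the four sub-scores (each out of 25).
total = (
    data["maintenance"]
    + data["adoption"]
    + data["maturity"]
    + data["community"]
)
print(total)  # 0 + 7 + 8 + 4 = 19, matching the 19/100 shown above
```

To work against the live endpoint instead of the sample payload, replace `sample` with the body returned by the `curl` command above (for example via `urllib.request.urlopen`).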