chenllliang/MLS
Source code of our paper "Focus on the Target’s Vocabulary: Masked Label Smoothing for Machine Translation" @ACL-2022
This project implements Masked Label Smoothing (MLS), a drop-in replacement for standard label smoothing in neural machine translation. Instead of spreading the smoothing probability over the whole joint vocabulary, MLS restricts it to the target language's vocabulary, masking out source-side tokens that the model should never produce. It is aimed at researchers and engineers building new machine translation models or enhancing existing ones.
No commits in the last 6 months.
Use this if you are developing or fine-tuning neural machine translation models and want to enhance translation quality and model calibration.
Not ideal if you are looking for an off-the-shelf translation tool for everyday use or are not familiar with machine learning model development.
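The repository's training code is the authoritative implementation; the sketch below only illustrates the core idea under stated assumptions (the function name, argument shapes, and mask convention are hypothetical, not the repo's API). It computes a smoothed cross-entropy in which the smoothing mass ε is spread uniformly over target-language vocabulary entries only, so source-side tokens receive zero soft-label probability:

```python
import numpy as np

def log_softmax(x):
    """Numerically stable log-softmax over the last axis."""
    x = x - x.max(axis=-1, keepdims=True)
    return x - np.log(np.exp(x).sum(axis=-1, keepdims=True))

def masked_label_smoothing_loss(logits, target, target_vocab_mask, smoothing=0.1):
    """Cross-entropy whose label-smoothing mass is restricted to target-language tokens.

    logits: (batch, vocab) raw model scores
    target: (batch,) gold token ids
    target_vocab_mask: (vocab,) bool; True for tokens of the target language.
        Source-only tokens receive zero smoothing probability (the "mask" in MLS).
    """
    log_probs = log_softmax(np.asarray(logits, dtype=float))
    mask = np.asarray(target_vocab_mask, dtype=float)

    # Spread the smoothing mass uniformly over target-language tokens only.
    smooth_dist = mask * (smoothing / mask.sum())

    # Soft labels: (1 - smoothing) on the gold token plus the masked smoothing mass.
    soft = np.tile(smooth_dist, (len(target), 1))
    soft[np.arange(len(target)), target] += 1.0 - smoothing

    return float(-(soft * log_probs).sum(axis=-1).mean())
```

Compared with plain label smoothing, the only change is that the uniform ε/K distribution is replaced by ε divided among masked-in (target-side) tokens, which is what "focus on the target's vocabulary" refers to.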
Stars: 18
Forks: 4
Language: Python
License: —
Last pushed: May 19, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/chenllliang/MLS"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
n-waves/multifit
The code to reproduce results from paper "MultiFiT: Efficient Multi-lingual Language Model...
princeton-nlp/SimCSE
[EMNLP 2021] SimCSE: Simple Contrastive Learning of Sentence Embeddings https://arxiv.org/abs/2104.08821
yxuansu/SimCTG
[NeurIPS'22 Spotlight] A Contrastive Framework for Neural Text Generation
alibaba-edu/simple-effective-text-matching
Source code of the ACL2019 paper "Simple and Effective Text Matching with Richer Alignment Features".
Shark-NLP/OpenICL
OpenICL is an open-source framework to facilitate research, development, and prototyping of...