mynlp/pnmt

Pre-train support for OpenNMT (PNMT)

Quality score: 36 / 100 (Emerging)

This project helps NLP researchers and practitioners build and train neural machine translation models. It extends OpenNMT so that pre-trained language models such as BERT can be incorporated into translation workflows. You supply parallel text data and configure the model to use BERT for word embeddings or as an encoder; the output is a more robust and potentially higher-quality translation model.

No commits in the last 6 months.

Use this if you are developing or experimenting with neural machine translation systems and want to use pre-trained BERT models to improve translation quality or to research new architectures.

Not ideal if you are looking for a plug-and-play translation tool for immediate use, as this is a research-focused extension requiring familiarity with OpenNMT.

neural-machine-translation natural-language-processing text-translation nlp-research language-modeling
Stale (6m) · No package · No dependents
Maintenance: 0 / 25
Adoption: 7 / 25
Maturity: 16 / 25
Community: 13 / 25


Stars: 32
Forks: 5
Language: Python
License: MIT
Last pushed: Aug 13, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/mynlp/pnmt"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
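The same endpoint can be called from Python. This is a minimal sketch assuming only what the curl example shows: the base URL and the `nlp/mynlp/pnmt` path. The shape of the JSON response is not documented here, so the example decodes it as-is, and the `quality_url` / `fetch_quality` helper names are my own, not part of the API.

```python
# Hypothetical helpers for the quality API shown in the curl example above.
# Only the base URL and the path segments come from this page; the response
# schema is unknown, so the JSON is returned as a plain dict.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(registry: str, owner: str, name: str) -> str:
    """Build the quality-endpoint URL for a project."""
    return f"{BASE}/{registry}/{owner}/{name}"


def fetch_quality(registry: str, owner: str, name: str) -> dict:
    """Fetch and decode the quality JSON (keyless tier: 100 requests/day)."""
    with urllib.request.urlopen(quality_url(registry, owner, name)) as resp:
        return json.load(resp)


# Example: the pnmt project from this page.
# print(fetch_quality("nlp", "mynlp", "pnmt"))
```

The network call is left commented out; `quality_url` alone is enough to confirm the request matches the curl command.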