omarsar/pytorch_neural_machine_translation_attention

Neural Machine Translation with Attention (PyTorch)

Score: 42 / 100 (Emerging)

This project helps machine learning engineers and researchers explore how to build and train a neural machine translation model. Given input text in one language (e.g., Spanish), the model outputs a translation in another language (e.g., English). It's designed for those who want to understand the inner workings of an attention-based sequence-to-sequence model in PyTorch.
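The core of such a model is the attention step: at each decoding step, the decoder scores every encoder output, softmaxes the scores into weights, and takes a weighted sum as the context vector. Below is a minimal NumPy sketch of additive (Bahdanau-style) attention; all names, shapes, and weights here are illustrative, not the repo's actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

def bahdanau_attention(decoder_state, encoder_outputs, W1, W2, v):
    """Additive (Bahdanau) attention, sketched with NumPy.

    decoder_state:   (hidden,)         current decoder hidden state
    encoder_outputs: (src_len, hidden) one vector per source token
    Returns the context vector and the attention weights.
    """
    # score_i = v . tanh(W1 @ h_enc_i + W2 @ h_dec)
    scores = np.tanh(encoder_outputs @ W1.T + decoder_state @ W2.T) @ v  # (src_len,)
    # Softmax over source positions (numerically stabilized)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Context vector: attention-weighted sum of encoder outputs
    context = weights @ encoder_outputs  # (hidden,)
    return context, weights

# Illustrative sizes; the repo's notebook defines its own dimensions.
hidden, src_len = 8, 5
W1 = rng.normal(size=(hidden, hidden))
W2 = rng.normal(size=(hidden, hidden))
v = rng.normal(size=hidden)
enc = rng.normal(size=(src_len, hidden))
dec = rng.normal(size=hidden)

context, weights = bahdanau_attention(dec, enc, W1, W2, v)
```

The weights form a probability distribution over source tokens, which is what makes attention inspectable: plotting them per decoding step yields the alignment maps commonly shown in NMT tutorials.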

No commits in the last 6 months.

Use this if you are a deep learning practitioner who wants to study a PyTorch implementation of neural machine translation with attention.

Not ideal if you need an out-of-the-box translation service for end users, or if you are not yet comfortable with deep learning frameworks and code.

Tags: neural machine translation, sequence-to-sequence models, attention mechanisms, deep learning research, natural language processing

Badges: Stale (6 months), No Package, No Dependents
Maintenance: 0 / 25
Adoption: 8 / 25
Maturity: 16 / 25
Community: 18 / 25

How are scores calculated?

Stars: 44
Forks: 15
Language: Jupyter Notebook
License: MIT
Last pushed: Nov 13, 2018
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/omarsar/pytorch_neural_machine_translation_attention"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
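The same request can be made from Python's standard library. A minimal sketch, assuming only the URL pattern shown in the curl command above (the shape of the JSON response is not documented here, so it is left unparsed):

```python
import json
import urllib.request

# Base path taken from the curl example above; everything else about
# the endpoint (other categories, response fields) is an assumption.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-report URL for a repository."""
    return f"{BASE}/{category}/{owner}/{repo}"

url = quality_url("nlp", "omarsar", "pytorch_neural_machine_translation_attention")

# Uncomment to fetch the report (requires network access):
# with urllib.request.urlopen(url) as resp:
#     report = json.load(resp)
```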