omarsar/pytorch_neural_machine_translation_attention
Neural Machine Translation with Attention (PyTorch)
This project shows machine learning engineers and researchers how to build and train a neural machine translation model: given text in one language (e.g. Spanish), it produces a translation in another (e.g. English). It's aimed at those who want to understand the inner workings of an attention-based sequence-to-sequence model in PyTorch.
No commits in the last 6 months.
Use this if you are a deep learning practitioner interested in the PyTorch implementation of neural machine translation with attention.
Not ideal if you need an out-of-the-box translation service for end-users, or if you are not comfortable with deep learning frameworks and code.
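The core idea the repository walks through is an encoder-decoder model where the decoder attends over encoder states at every step. A minimal sketch of additive (Bahdanau-style) attention, the mechanism such models typically use, is below; the class name and sizes are illustrative, not taken from the notebook:

```python
import torch
import torch.nn as nn

class BahdanauAttention(nn.Module):
    """Additive attention: score(h_dec, h_enc) = v^T tanh(W1 h_dec + W2 h_enc)."""

    def __init__(self, hidden_size):
        super().__init__()
        self.W1 = nn.Linear(hidden_size, hidden_size)  # projects the decoder state
        self.W2 = nn.Linear(hidden_size, hidden_size)  # projects each encoder state
        self.v = nn.Linear(hidden_size, 1)             # collapses to a scalar score

    def forward(self, dec_hidden, enc_outputs):
        # dec_hidden: (batch, hidden), enc_outputs: (batch, src_len, hidden)
        scores = self.v(torch.tanh(
            self.W1(dec_hidden).unsqueeze(1) + self.W2(enc_outputs)
        ))                                              # (batch, src_len, 1)
        weights = torch.softmax(scores, dim=1)          # distribution over source positions
        context = (weights * enc_outputs).sum(dim=1)    # (batch, hidden) weighted sum
        return context, weights.squeeze(-1)
```

At each decoding step the returned context vector is concatenated with the decoder input, so the model can focus on different source words as it generates the translation.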
Stars
44
Forks
15
Language
Jupyter Notebook
License
MIT
Category
Last pushed
Nov 13, 2018
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/omarsar/pytorch_neural_machine_translation_attention"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
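For programmatic use, the same endpoint can be called from Python. This is a sketch based only on the URL layout in the curl example above; the response fields and any API-key header are assumptions not documented here:

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category, owner, repo):
    """Build the quality-endpoint URL for a repository
    (path layout inferred from the curl example)."""
    return f"{API_BASE}/{category}/{owner}/{repo}"

def fetch_quality(category, owner, repo):
    """Fetch the quality record as a dict.
    Anonymous access is rate-limited to 100 requests/day."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

# Example (performs a live network request):
# data = fetch_quality("nlp", "omarsar",
#                      "pytorch_neural_machine_translation_attention")
```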
Higher-rated alternatives
xv44586/toolkit4nlp
Transformers implementation (architecture, task examples, serving, and more)
luozhouyang/transformers-keras
Transformer-based models implemented in TensorFlow 2.x (using Keras).
ufal/neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
graykode/xlnet-Pytorch
Simple XLNet implementation with Pytorch Wrapper
uzaymacar/attention-mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language...