AuCson/PyTorch-Batch-Attention-Seq2seq
PyTorch implementation of batched bi-RNN encoder and attention-decoder.
A building block for developers working on natural language processing tasks, particularly machine translation and other sequence-to-sequence problems. It processes padded batches of text through a bidirectional RNN encoder and an attention-based decoder, taking in sequences of tokens and outputting translated or otherwise transformed sequences. It is aimed at developers building their own sequence models rather than end users.
281 stars. No commits in the last 6 months.
Use this if you are a developer building a sequence-to-sequence model in PyTorch and need a batched implementation of a bidirectional RNN encoder and attention decoder to build on.
Not ideal if you are looking for a high-level, off-the-shelf machine translation system; this is a lower-level component for model development.
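To make the architecture concrete, here is a minimal PyTorch sketch of the two pieces the repo provides: a batched bidirectional GRU encoder over padded sequences, and an attention step for the decoder. This is an illustration, not the repository's actual code; all names are invented here, and the scoring function shown is simple dot-product attention (the repo may use a different variant, e.g. additive/concat attention).

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

class Encoder(nn.Module):
    """Bidirectional GRU encoder over a padded batch (illustrative sketch)."""

    def __init__(self, vocab_size, embed_size, hidden_size):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_size, padding_idx=0)
        self.gru = nn.GRU(embed_size, hidden_size,
                          bidirectional=True, batch_first=True)

    def forward(self, src, lengths):
        # src: (batch, max_len) padded token ids; lengths: true sequence lengths
        embedded = self.embedding(src)
        # Packing lets the RNN skip padding positions, which is the point
        # of a batched implementation.
        packed = pack_padded_sequence(embedded, lengths,
                                      batch_first=True, enforce_sorted=False)
        outputs, hidden = self.gru(packed)
        outputs, _ = pad_packed_sequence(outputs, batch_first=True)
        # outputs: (batch, max_len, 2 * hidden); sum the two directions
        # so downstream code sees a single hidden_size.
        h = self.gru.hidden_size
        return outputs[:, :, :h] + outputs[:, :, h:], hidden

def attention(decoder_hidden, encoder_outputs):
    """Dot-product attention: weight encoder states by the decoder query."""
    # decoder_hidden: (batch, hidden); encoder_outputs: (batch, max_len, hidden)
    scores = torch.bmm(encoder_outputs, decoder_hidden.unsqueeze(2)).squeeze(2)
    weights = torch.softmax(scores, dim=1)           # (batch, max_len)
    context = torch.bmm(weights.unsqueeze(1), encoder_outputs).squeeze(1)
    return context, weights                           # (batch, hidden), weights
```

At each decoding step, the context vector is concatenated with the decoder input (or state) to predict the next token; the attention weights indicate which source positions the decoder attended to.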
Stars: 281
Forks: 45
Language: Python
License: —
Category: nlp
Last pushed: Jan 18, 2019
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/AuCson/PyTorch-Batch-Attention-Seq2seq"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
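The same endpoint can be called from Python with the standard library. This is a sketch assuming the endpoint returns a JSON body; the response schema is not documented here, and the helper names are invented for illustration.

```python
import json
from urllib.request import urlopen

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category, owner, repo):
    """Build the quality-API URL for a repository."""
    return f"{API_BASE}/{category}/{owner}/{repo}"

def fetch_quality(category, owner, repo, timeout=10):
    """Fetch the quality record; assumes the API returns JSON."""
    with urlopen(quality_url(category, owner, repo), timeout=timeout) as resp:
        return json.load(resp)
```

For example, `fetch_quality("nlp", "AuCson", "PyTorch-Batch-Attention-Seq2seq")` requests the same record as the curl command above.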
Higher-rated alternatives
xv44586/toolkit4nlp
Transformers implementations (architecture, task examples, serving, and more)
luozhouyang/transformers-keras
Transformer-based models implemented in TensorFlow 2.x (using Keras).
ufal/neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
graykode/xlnet-Pytorch
Simple XLNet implementation with Pytorch Wrapper
uzaymacar/attention-mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language...