AuCson/PyTorch-Batch-Attention-Seq2seq

PyTorch implementation of batched bi-RNN encoder and attention-decoder.

Score: 37 / 100 (Emerging)

This is a library for developers working on natural language processing tasks, specifically machine translation and other sequence-to-sequence problems. It processes batches of text sequences in parallel through a bi-directional RNN encoder and attention decoder, taking in sequences of tokens and producing translated or otherwise transformed sequences, which is much faster than feeding the network one sequence at a time. Developers building their own sequence-to-sequence models would use this.

281 stars. No commits in the last 6 months.

Use this if you are a developer building a sequence-to-sequence model in PyTorch and need a batched implementation of a bi-directional RNN encoder and attention decoder, one that processes many sequences per forward pass instead of one at a time (see the sketch below).

Not ideal if you are looking for a high-level, off-the-shelf machine translation system, as this is a lower-level component for model development.
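For orientation, the sketch below shows the general pattern such a component implements: padded batches packed through a bi-directional GRU encoder, plus additive (Bahdanau-style) attention over the encoder outputs. The class names, signatures, and dimensions are illustrative assumptions, not this repository's actual API.

import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    # Batched bi-directional GRU encoder over padded token sequences.
    def __init__(self, vocab_size, embed_dim, hidden_dim):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.gru = nn.GRU(embed_dim, hidden_dim, bidirectional=True, batch_first=True)

    def forward(self, src, lengths):
        # src: (batch, max_len) padded token ids; lengths: true sequence lengths.
        embedded = self.embedding(src)
        # Packing skips pad positions so the RNN does no wasted work.
        packed = nn.utils.rnn.pack_padded_sequence(
            embedded, lengths, batch_first=True, enforce_sorted=False)
        outputs, hidden = self.gru(packed)
        outputs, _ = nn.utils.rnn.pad_packed_sequence(outputs, batch_first=True)
        return outputs, hidden  # outputs: (batch, max_len, 2 * hidden_dim)

class Attention(nn.Module):
    # Additive attention: scores each encoder position against the decoder state.
    def __init__(self, enc_dim, dec_dim):
        super().__init__()
        self.attn = nn.Linear(enc_dim + dec_dim, dec_dim)
        self.v = nn.Linear(dec_dim, 1, bias=False)

    def forward(self, dec_hidden, enc_outputs, mask):
        # dec_hidden: (batch, dec_dim); enc_outputs: (batch, max_len, enc_dim);
        # mask: (batch, max_len) bool, True at real (non-pad) positions.
        max_len = enc_outputs.size(1)
        dec_expanded = dec_hidden.unsqueeze(1).expand(-1, max_len, -1)
        energy = self.v(torch.tanh(
            self.attn(torch.cat([dec_expanded, enc_outputs], dim=2)))).squeeze(2)
        energy = energy.masked_fill(~mask, float("-inf"))  # ignore padding
        return F.softmax(energy, dim=1)  # (batch, max_len) attention weights

# Example with a batch of two padded sequences (0 is the pad id):
# enc = Encoder(vocab_size=1000, embed_dim=64, hidden_dim=128)
# src = torch.tensor([[5, 9, 2, 0], [7, 3, 0, 0]])
# outputs, hidden = enc(src, lengths=torch.tensor([3, 2]))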

natural-language-processing machine-translation neural-networks deep-learning sequence-modeling
No License · Stale (6 months) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 8 / 25
Community: 19 / 25


Stars: 281
Forks: 45
Language: Python
License: None
Last pushed: Jan 18, 2019
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/AuCson/PyTorch-Batch-Attention-Seq2seq"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
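To consume the endpoint from Python instead, the sketch below uses only the standard library. The response schema is not documented on this page, so the example simply prints whatever top-level JSON fields the endpoint returns; the URL is copied verbatim from the curl command above.

import json
import urllib.request

URL = ("https://pt-edge.onrender.com/api/v1/quality/nlp/"
       "AuCson/PyTorch-Batch-Attention-Seq2seq")

with urllib.request.urlopen(URL) as resp:
    data = json.load(resp)  # assumes the endpoint returns a JSON object

for key, value in data.items():
    print(f"{key}: {value}")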