stevezheng23/seq2seq_tf

Sequence-to-Sequence in TensorFlow

20 / 100
Experimental

This project helps machine learning engineers or researchers build and train sequence-to-sequence models for natural language processing tasks. It takes pairs of input and output sequences (like English and Vietnamese sentences) and produces a trained model that can generate a target sequence from a new source sequence. This is useful for anyone working on machine translation, text summarization, or question answering systems.
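To illustrate the kind of data such a model consumes, here is a minimal, framework-free sketch (plain Python; all helper names are illustrative, not part of this repository's API) of turning parallel sentence pairs into padded id sequences, the form an encoder-decoder pipeline typically expects:

```python
# Minimal sketch: preparing parallel sentence pairs for a seq2seq model.
# All names here are illustrative, not part of this repository's API.

PAD, SOS, EOS = "<pad>", "<s>", "</s>"

def build_vocab(sentences):
    """Map each token to an integer id, reserving ids for special tokens."""
    vocab = {PAD: 0, SOS: 1, EOS: 2}
    for sent in sentences:
        for tok in sent.split():
            vocab.setdefault(tok, len(vocab))
    return vocab

def encode(sentence, vocab, max_len):
    """Convert a sentence to ids, append EOS, and pad to a fixed length."""
    ids = [vocab[tok] for tok in sentence.split()] + [vocab[EOS]]
    return ids + [vocab[PAD]] * (max_len - len(ids))

# A toy English -> Vietnamese parallel corpus (source/target pairs).
pairs = [("hello world", "xin chao"), ("thank you", "cam on")]
src_vocab = build_vocab(s for s, _ in pairs)
tgt_vocab = build_vocab(t for _, t in pairs)
src_batch = [encode(s, src_vocab, 4) for s, _ in pairs]
tgt_batch = [encode(t, tgt_vocab, 4) for _, t in pairs]
```

The padded id batches are what would be fed to the encoder (source side) and decoder (target side) during training.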

No commits in the last 6 months.

Use this if you are a machine learning engineer or researcher looking to experiment with traditional Recurrent Neural Network (RNN) based sequence-to-sequence models, including vanilla and attention-based architectures, using TensorFlow 1.x.
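To make the attention-based variant concrete, here is a minimal sketch of dot-product attention in pure Python (this is illustrative only, not this repository's implementation, which uses TensorFlow ops):

```python
import math

def dot_product_attention(query, keys, values):
    """Score each key against the query, softmax the scores, and return
    the weighted sum of values -- the core of attention-based seq2seq."""
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    max_s = max(scores)                        # subtract max for stability
    exps = [math.exp(s - max_s) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    context = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(len(values[0]))]
    return context, weights

# Toy example: the query points in the same direction as the second key,
# so the second value dominates the context vector.
context, weights = dot_product_attention(
    query=[0.0, 1.0],
    keys=[[1.0, 0.0], [0.0, 1.0]],
    values=[[1.0, 2.0], [3.0, 4.0]],
)
```

At each decoding step, the decoder state plays the role of the query and the encoder outputs play the roles of keys and values, letting the decoder focus on the most relevant source positions.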

Not ideal if you need state-of-the-art performance, want to use more modern Transformer architectures, or are looking for a pre-trained model ready for immediate deployment.

Machine Translation · Text Summarization · Natural Language Processing · Deep Learning · Research
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 4 / 25
Maturity 16 / 25
Community 0 / 25


Stars: 8
Forks:
Language: Python
License: Apache-2.0
Last pushed: Apr 19, 2019
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/stevezheng23/seq2seq_tf"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
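The same request can be made from Python. This is a minimal sketch: the URL pattern follows the curl example above, the response is assumed to be JSON, and no specific response fields are assumed.

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category, owner, repo):
    """Build the API URL following the pattern in the curl example."""
    return f"{API_BASE}/{category}/{owner}/{repo}"

def fetch_quality(category, owner, repo, timeout=10):
    """Fetch and decode the JSON response; requires network access."""
    with urllib.request.urlopen(quality_url(category, owner, repo),
                                timeout=timeout) as resp:
        return json.load(resp)

# Example (performs a live request, so it is left commented out):
# data = fetch_quality("ml-frameworks", "stevezheng23", "seq2seq_tf")
```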