stevezheng23/seq2seq_tf
Sequence-to-Sequence in Tensorflow
This project helps machine learning engineers or researchers build and train sequence-to-sequence models for natural language processing tasks. It takes pairs of input and output sequences (like English and Vietnamese sentences) and produces a trained model that can generate a target sequence from a new source sequence. This is useful for anyone working on machine translation, text summarization, or question answering systems.
No commits in the last 6 months.
Use this if you are a machine learning engineer or researcher looking to experiment with traditional Recurrent Neural Network (RNN) based sequence-to-sequence models, including vanilla and attention-based architectures, using TensorFlow 1.x.
Not ideal if you need state-of-the-art performance, want to use more modern Transformer architectures, or are looking for a pre-trained model ready for immediate deployment.
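To illustrate the attention mechanism such models use, here is a pure-Python sketch of Luong-style dot-product attention over encoder states. This is an illustrative assumption about the general technique, not code from this repository:

```python
import math

def attention(decoder_state, encoder_states):
    """Dot-product attention: weight each encoder state by its
    similarity to the current decoder state, then return the
    weighted sum (the context vector) and the weights."""
    # Similarity score for each encoder state (dot product).
    scores = [sum(d * e for d, e in zip(decoder_state, enc))
              for enc in encoder_states]
    # Softmax over scores (shifted by max for numerical stability).
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Context vector: weighted sum of encoder states.
    context = [sum(w * enc[i] for w, enc in zip(weights, encoder_states))
               for i in range(len(decoder_state))]
    return context, weights

context, weights = attention([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]])
```

In the repository's attention-based architectures, a context vector like this is concatenated with the decoder state at each step before predicting the next target token.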
Stars: 8
Forks: —
Language: Python
License: Apache-2.0
Category:
Last pushed: Apr 19, 2019
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/stevezheng23/seq2seq_tf"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
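For scripted access, the same endpoint can be called from Python with the standard library. This is a minimal sketch; the `X-API-Key` header name used for the optional key is an assumption, so check the service's documentation before relying on it:

```python
import json
import urllib.request
from urllib.parse import quote

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category, owner, repo):
    """Build the quality-endpoint URL for a repo (path segments URL-escaped)."""
    return f"{API_BASE}/{quote(category)}/{quote(owner)}/{quote(repo)}"

def fetch_quality(category, owner, repo, api_key=None):
    """Fetch quality data as a dict. The 'X-API-Key' header name is
    an assumption, not documented by the source above."""
    req = urllib.request.Request(quality_url(category, owner, repo))
    if api_key:
        req.add_header("X-API-Key", api_key)
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)

print(quality_url("ml-frameworks", "stevezheng23", "seq2seq_tf"))
```

Without a key, stay under the 100-requests/day limit noted above.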
Higher-rated alternatives
facebookresearch/fairseq2: FAIR Sequence Modeling Toolkit 2
lhotse-speech/lhotse: Tools for handling multimodal data in machine learning projects.
google/sequence-layers: A neural network layer API and library for sequence modeling, designed for easy creation of...
awslabs/sockeye: Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch
OpenNMT/OpenNMT-tf: Neural machine translation and sequence learning using TensorFlow