howardyclo/pytorch-seq2seq-example

A fully batched seq2seq example based on practical-pytorch, with extra features.

Score: 36 / 100 (Emerging)

This project helps natural language processing developers learn how to build and optimize sequence-to-sequence models for tasks like grammatical error correction. It takes grammatically incorrect English sentences as input and produces corrected sentences as output. It is designed for developers who are learning or implementing advanced NLP models in PyTorch.

No commits in the last 6 months.

Use this if you are a PyTorch developer looking for a detailed, fully batched sequence-to-sequence example with extra features and comprehensive comments, useful for understanding how to implement models for tasks like grammatical error correction.

Not ideal if you need a production-ready, highly optimized, and feature-rich library like OpenNMT-py for advanced NLP tasks, especially those requiring beam search or CuDNN optimization.

natural-language-processing deep-learning grammatical-error-correction sequence-modeling pytorch-development
No License · Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 9 / 25
Maturity: 8 / 25
Community: 19 / 25

How are scores calculated?

Stars: 76

Forks: 17

Language: Jupyter Notebook

License: None

Last pushed: Mar 11, 2018

Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/howardyclo/pytorch-seq2seq-example"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
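The same endpoint can also be called programmatically. A minimal Python sketch using only the standard library; the structure of the JSON response is an assumption, only the URL is taken from the curl example above:

```python
import json
from urllib.request import urlopen

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality endpoint URL for a repository."""
    return f"{API_BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch the quality record as a dict (assumes a JSON response body)."""
    with urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)


# Example: the URL for this repository, matching the curl command above.
url = quality_url("nlp", "howardyclo", "pytorch-seq2seq-example")
```

Swap `urlopen` for a library like `requests` if you already depend on one; the URL construction is the only part specific to this API.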