cpm0722/transformer_pytorch

Transformer ("Attention Is All You Need") Implementation in PyTorch

Score: 36 / 100 (Emerging)

This is a from-scratch implementation of the Transformer neural network architecture in PyTorch, following the 'Attention Is All You Need' paper. It demonstrates how to build and train a Transformer model for sequence-to-sequence tasks such as machine translation. Researchers and students in natural language processing or deep learning who want to understand or build upon this architecture would find it useful.
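For orientation, the core operation the paper defines is scaled dot-product attention: Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. The minimal PyTorch sketch below illustrates that operation only; it is not code taken from this repository, and the tensor shapes, function name, and optional mask argument are assumptions made for the example.

import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: (batch, heads, seq_len, d_k) tensors; mask broadcasts over the
    # (seq_len, seq_len) score matrix, with 0 marking positions to hide.
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)  # attention distribution over keys
    return weights @ v, weights

# Illustrative usage with random inputs (self-attention: q = k = v).
q = k = v = torch.randn(2, 8, 10, 64)
out, attn = scaled_dot_product_attention(q, k, v)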

No commits in the last 6 months.

Use this if you are a researcher or student in machine learning and want to study, reproduce, or build upon the Transformer architecture for sequence-to-sequence tasks, particularly for neural machine translation.

Not ideal if you need a high-level library to apply a pre-trained Transformer model directly, or if you're not comfortable working with raw PyTorch code for model implementation.

neural-networks deep-learning natural-language-processing machine-translation AI-research
No License · Stale (6 months) · No Package · No Dependents
Maintenance 0 / 25
Adoption 9 / 25
Maturity 8 / 25
Community 19 / 25


Stars: 73
Forks: 17
Language: Python
License: None
Last pushed: Dec 02, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/cpm0722/transformer_pytorch"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
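The same endpoint can also be queried from Python with the requests library. A minimal sketch, assuming the endpoint returns a JSON body (the exact response fields are not documented here):

import requests

url = "https://pt-edge.onrender.com/api/v1/quality/nlp/cpm0722/transformer_pytorch"
resp = requests.get(url, timeout=10)
resp.raise_for_status()
print(resp.json())  # assumed JSON payload; inspect it to see the available fields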