tanjeffreyz/attention-is-all-you-need
PyTorch implementation of "Attention Is All You Need" by Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, and Illia Polosukhin
This project is a practical PyTorch implementation of the Transformer for machine translation: given source-language text (e.g., English), it produces a translation in a target language (e.g., German). It is aimed at researchers and students who want to understand and experiment with the core "Attention Is All You Need" architecture.
No commits in the last 6 months.
Use this if you are a machine learning researcher or student looking to study, reproduce, or build upon the Transformer architecture for sequence-to-sequence tasks like machine translation.
Not ideal if you are a practitioner needing a production-ready, highly optimized machine translation system for immediate deployment.
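The core building block of the architecture this repo implements is scaled dot-product attention: Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. The sketch below is a minimal NumPy illustration of that formula, not code from this repository (the repo itself is in PyTorch); shapes and variable names are chosen for the example.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V, as in the paper."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # (n_q, n_k) similarity scores
    # Numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                               # weighted sum of value vectors

# Toy example: 3 query positions attending over 4 key/value positions, d_k = 8
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 8): one d_k-dimensional output per query
```

In the full model this operation is run in parallel across multiple heads and stacked with feed-forward layers, but every head reduces to this same computation.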
Stars
18
Forks
1
Language
Python
License
—
Category
Last pushed
Aug 26, 2023
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/tanjeffreyz/attention-is-all-you-need"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
xv44586/toolkit4nlp
transformers implement (architecture, task example, serving and more)
luozhouyang/transformers-keras
Transformer-based models implemented in TensorFlow 2.x (using Keras).
ufal/neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
graykode/xlnet-Pytorch
Simple XLNet implementation with Pytorch Wrapper
uzaymacar/attention-mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language...