tranquoctrinh/transformer
This is a PyTorch implementation of the Transformer model from the paper "Attention Is All You Need".
This project gives machine learning engineers and researchers a Transformer implementation for sequence-to-sequence tasks such as machine translation: it takes text in one language (e.g. English) as input and outputs the translation in another (e.g. Vietnamese). It is aimed at those building and experimenting with neural machine translation systems.
No commits in the last 6 months.
Use this if you are a machine learning practitioner looking for a PyTorch implementation of the Transformer model to build custom natural language processing applications, especially for translation.
Not ideal if you need an out-of-the-box translation service or a high-level API for readily available models without deep model customization.
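The core mechanism of the paper this repository implements is scaled dot-product attention. As a quick orientation, here is a minimal pure-Python sketch of that formula (softmax(QKᵀ/√dₖ)V); it illustrates the mechanism only and is not this repository's API, whose function names and structure may differ.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def scaled_dot_product_attention(Q, K, V):
    # Q, K, V: lists of row vectors (lists of floats).
    # K and V must have the same number of rows.
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # Weighted average of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# One query attending over two key/value pairs: the query matches the
# first key more closely, so the first value dominates the output.
Q = [[1.0, 0.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[10.0, 0.0], [0.0, 10.0]]
result = scaled_dot_product_attention(Q, K, V)
```

Because the attention weights are a softmax, each output row is a convex combination of the value rows, so its components here sum to 10.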
Stars: 37
Forks: 5
Language: Python
License: MIT
Category:
Last pushed: Apr 07, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/tranquoctrinh/transformer"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.
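The same request can be made from Python with the standard library. This is a sketch assuming only the endpoint URL shown in the curl example above; the JSON response schema is not documented here, so inspect the returned data before relying on specific fields.

```python
import json
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(owner: str, repo: str) -> str:
    # Build the per-repository endpoint. The "transformers" path segment
    # is taken from the curl example above.
    return f"{BASE}/transformers/{owner}/{repo}"

url = quality_url("tranquoctrinh", "transformer")

# To perform the request (requires network access):
# with urlopen(url) as resp:
#     data = json.load(resp)
# print(data)
```

Within the free tier no API key is needed, so the plain `urlopen` call above is sufficient.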
Higher-rated alternatives
microsoft/LoRA
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
jadore801120/attention-is-all-you-need-pytorch
A PyTorch implementation of the Transformer model in "Attention is All You Need".
bhavnicksm/vanilla-transformer-jax
JAX/Flax implementation of "Attention Is All You Need" by Vaswani et al....
kyegomez/SparseAttention
PyTorch implementation of the sparse attention from the paper "Generating Long Sequences with...
AbdelStark/attnres
Rust implementation of Attention Residuals from MoonshotAI/Kimi