ankushhKapoor/transformer-from-scratch
Transformer from scratch implementation in PyTorch for Neural Machine Translation. Features manual Multi-Head Attention, Beam Search, and SacreBLEU evaluation.
This project helps machine learning practitioners and researchers understand how neural machine translation works by providing a complete, self-contained implementation of a Transformer model. You can input text in one language, like English, and get its translation in another, such as Italian or Hindi. It's designed for those who want to build and evaluate custom translation models from the ground up.
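The description highlights a manually implemented Multi-Head Attention layer. As a generic sketch of what that entails in PyTorch (an illustration of the standard technique, not this repository's actual code):

```python
import torch
import torch.nn as nn

class MultiHeadAttention(nn.Module):
    """Scaled dot-product attention split across several heads."""

    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        assert d_model % n_heads == 0, "d_model must divide evenly into heads"
        self.d_k = d_model // n_heads
        self.n_heads = n_heads
        self.w_q = nn.Linear(d_model, d_model)
        self.w_k = nn.Linear(d_model, d_model)
        self.w_v = nn.Linear(d_model, d_model)
        self.w_o = nn.Linear(d_model, d_model)

    def forward(self, q, k, v, mask=None):
        batch = q.size(0)
        # Project, then reshape to (batch, heads, seq_len, d_k).
        q = self.w_q(q).view(batch, -1, self.n_heads, self.d_k).transpose(1, 2)
        k = self.w_k(k).view(batch, -1, self.n_heads, self.d_k).transpose(1, 2)
        v = self.w_v(v).view(batch, -1, self.n_heads, self.d_k).transpose(1, 2)
        # Scaled dot-product attention per head.
        scores = q @ k.transpose(-2, -1) / self.d_k ** 0.5
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float("-inf"))
        attn = scores.softmax(dim=-1)
        # Concatenate heads back to (batch, seq_len, d_model) and project.
        out = (attn @ v).transpose(1, 2).contiguous().view(batch, -1, self.n_heads * self.d_k)
        return self.w_o(out)
```

The reshape/transpose dance is the usual way to run all heads in one batched matrix multiply instead of looping over them.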
Use this if you are a machine learning researcher or student aiming to deeply understand and experiment with the Transformer architecture for neural machine translation, rather than just using a pre-built library.
Not ideal if you need a quick, ready-to-use translation tool, or a production system requiring high-throughput, off-the-shelf translation.
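Beam Search is also listed among the features. A minimal, framework-free sketch of the algorithm (the `step_fn` callback and token names are illustrative, not this repository's API):

```python
def beam_search(start_token, step_fn, end_token, beam_width=3, max_len=10):
    """Keep the `beam_width` highest log-probability partial sequences.

    `step_fn(seq)` must return a list of (token, log_prob) continuations
    for the partial sequence `seq`.
    """
    beams = [([start_token], 0.0)]   # (sequence, cumulative log-prob)
    finished = []
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            for tok, logp in step_fn(seq):
                new_seq = seq + [tok]
                if tok == end_token:
                    # Completed hypotheses leave the beam.
                    finished.append((new_seq, score + logp))
                else:
                    candidates.append((new_seq, score + logp))
        if not candidates:
            break
        # Prune to the top-scoring partial sequences.
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    finished.extend(beams)
    return max(finished, key=lambda c: c[1])
```

Real decoders typically also length-normalize the scores so that shorter hypotheses are not unfairly favored; this sketch omits that for brevity.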
Stars: 8
Forks: —
Language: Python
License: —
Category: —
Last pushed: Jan 02, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/ankushhKapoor/transformer-from-scratch"
Open to everyone: 100 requests/day with no key; a free key raises the limit to 1,000/day.
Higher-rated alternatives
lvapeab/nmt-keras
Neural Machine Translation with Keras
dair-ai/Transformers-Recipe
🧠A study guide to learn about Transformers
SirawitC/Transformer_from_scratch_pytorch
Build a transformer model from scratch using pytorch to understand its inner workings and gain...
jaketae/ensemble-transformers
Ensembling Hugging Face transformers made easy
lof310/transformer
PyTorch implementation of the current SOTA Transformer. Configurable, efficient, and...