ankushhKapoor/transformer-from-scratch

A from-scratch Transformer implementation in PyTorch for neural machine translation, featuring a manually implemented Multi-Head Attention, Beam Search decoding, and SacreBLEU evaluation.
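The repository's own beam-search decoder is not shown on this page; as an illustration of the technique it names, here is a minimal sketch in plain Python. The `step_fn` interface and the toy distribution are hypothetical, chosen only to make the example self-contained — an NMT decoder would instead score next tokens with the trained model.

```python
import math

def beam_search(step_fn, start_token, end_token, beam_width=3, max_len=10):
    """Generic beam search: keep the beam_width highest-scoring partial
    sequences, expanding each with step_fn at every step.

    step_fn(seq) must return a list of (token, probability) pairs.
    Scores are summed log-probabilities, as in typical NMT decoders.
    """
    beams = [([start_token], 0.0)]  # (sequence, cumulative log-prob)
    completed = []
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            for token, prob in step_fn(seq):
                candidates.append((seq + [token], score + math.log(prob)))
        # Keep only the top-k candidates by score.
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = []
        for seq, score in candidates[:beam_width]:
            if seq[-1] == end_token:
                completed.append((seq, score))
            else:
                beams.append((seq, score))
        if not beams:  # every surviving hypothesis has finished
            break
    return max(completed + beams, key=lambda c: c[1])

# Toy next-token distribution (hypothetical, for demonstration only).
def toy_step(seq):
    table = {
        "<s>": [("hello", 0.6), ("hi", 0.4)],
        "hello": [("world", 0.9), ("</s>", 0.1)],
        "hi": [("there", 0.5), ("</s>", 0.5)],
        "world": [("</s>", 1.0)],
        "there": [("</s>", 1.0)],
    }
    return table[seq[-1]]

best, score = beam_search(toy_step, "<s>", "</s>", beam_width=2)
print(best)  # → ['<s>', 'hello', 'world', '</s>']
```

With `beam_width=1` this degenerates to greedy decoding; wider beams trade compute for a better chance of finding the highest-probability full sequence.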

Score: 15 / 100 (Experimental)

This project helps machine learning practitioners and researchers understand how neural machine translation works by providing a complete, self-contained implementation of a Transformer model. You can input text in one language, like English, and get its translation in another, such as Italian or Hindi. It's designed for those who want to build and evaluate custom translation models from the ground up.

Use this if you are a machine learning researcher or student aiming to deeply understand and experiment with the Transformer architecture for neural machine translation, rather than just using a pre-built library.

Not ideal if you're a casual user looking for a quick, ready-to-use translation tool or a production system that requires high-throughput, off-the-shelf translation.

Topics: neural-machine-translation · natural-language-processing · deep-learning-research · language-modeling · sequence-to-sequence
No License · No Package · No Dependents
Maintenance: 6 / 25
Adoption: 4 / 25
Maturity: 5 / 25
Community: 0 / 25


Stars: 8
Forks:
Language: Python
License:
Last pushed: Jan 02, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/ankushhKapoor/transformer-from-scratch"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
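The curl call above can also be reproduced programmatically. A minimal sketch using Python's standard library — the endpoint path is taken verbatim from the example above, and the shape of the JSON response is not documented here, so the code only builds and fetches the URL rather than assuming specific fields:

```python
import urllib.request
import json

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a given repository."""
    return f"{BASE}/{category}/{owner}/{repo}"

url = quality_url("ml-frameworks", "ankushhKapoor", "transformer-from-scratch")
print(url)

# Fetch and decode (network call; uncomment to run against the live API).
# No key is needed within the free 100 requests/day limit stated above.
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
```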