FareedKhan-dev/Understanding-Transformers-Step-by-Step-math-example
Understanding the large language model Transformer architecture, explained as if to a child
This project provides a detailed, step-by-step mathematical guide to how large language model Transformers work. It walks through the full pipeline, from raw text sentences, to their numerical representations, and finally to the attention mechanism. It's intended for anyone who wants to understand the underlying calculations of modern AI models without needing to write code.
No commits in the last 6 months.
Use this if you are a student, researcher, or AI enthusiast looking for a concrete, numerical walkthrough of Transformer architecture, starting from basic text data.
Not ideal if you're looking for a theoretical overview without mathematical examples, or a high-level conceptual explanation of how to apply Transformers in a practical setting.
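To give a flavor of the kind of calculation the guide works through by hand, here is a minimal sketch (not code from the repository, which is deliberately code-free) of scaled dot-product attention. The shapes, random values, and the helper names softmax and scaled_dot_product_attention are illustrative assumptions, not the repository's own notation.

# Minimal illustrative sketch of scaled dot-product attention (assumed setup).
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the chosen axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # scores[i, j] measures how strongly token i attends to token j.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V                   # weighted mix of value vectors

# Toy example: 3 tokens, 4-dimensional embeddings (arbitrary numbers).
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))              # "embedded" input tokens
W_q, W_k, W_v = (rng.normal(size=(4, 4)) for _ in range(3))
out = scaled_dot_product_attention(X @ W_q, X @ W_k, X @ W_v)
print(out.shape)                         # (3, 4): one output vector per token

The repository performs the same matrix multiplications, scaling, and softmax steps on small hand-picked numbers so each intermediate value can be checked on paper.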
Stars: 28
Forks: 8
Language: —
License: —
Category:
Last pushed: Apr 03, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/FareedKhan-dev/Understanding-Transformers-Step-by-Step-math-example"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
lucidrains/x-transformers
A concise but complete full-attention transformer with a set of promising experimental features...
kanishkamisra/minicons
Utility for behavioral and representational analyses of Language Models
lucidrains/simple-hierarchical-transformer
Experiments around a simple idea for inducing multiple hierarchical predictive models within a GPT
lucidrains/dreamer4
Implementation of Danijar's latest iteration for his Dreamer line of work
Nicolepcx/Transformers-in-Action
This is the corresponding code for the book Transformers in Action