mytechnotalent/RE-GPT
Inspired by Andrej Karpathy's "Let's Build GPT", this project walks machine learning practitioners step-by-step through building a Generative Pre-trained Transformer (GPT) from scratch, demystifying its architecture through clear, hands-on code.
It takes raw text data as input and produces a functional, foundational language model. The ideal user is a data scientist, machine learning engineer, or AI researcher who wants to deeply understand the mechanics of large language models.
Use this if you are a machine learning professional who wants a hands-on, code-driven tutorial to grasp the inner workings of GPT architecture and self-attention mechanisms.
Not ideal if you are looking for a pre-built library to quickly implement a GPT model without understanding its core components, or if you are not comfortable with Python and PyTorch.
Stars: 27
Forks: 5
Language: Jupyter Notebook
License: Apache-2.0
Category:
Last pushed: Dec 12, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/mytechnotalent/RE-GPT"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
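For scripted access, the curl call above can be reproduced from Python's standard library. This is a minimal sketch: the URL pattern comes from the example above, but the response fields and the authorization header name for keyed access are assumptions, not documented behavior of the service.

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"


def build_url(ecosystem: str, repo: str) -> str:
    """Build the quality-API URL, e.g. ecosystem 'transformers', repo 'mytechnotalent/RE-GPT'."""
    return f"{BASE}/{ecosystem}/{repo}"


def fetch_quality(ecosystem: str, repo: str, api_key=None) -> dict:
    """Fetch quality data for a repo as a dict parsed from the JSON response."""
    req = urllib.request.Request(build_url(ecosystem, repo))
    if api_key:
        # Header name is an assumption; check the API docs for the real scheme.
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)


print(build_url("transformers", "mytechnotalent/RE-GPT"))
```

Without a key this is limited to 100 requests/day per the note above, so cache responses rather than polling.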
Higher-rated alternatives
tabularis-ai/be_great
A novel approach for synthesizing tabular data using pretrained large language models
EleutherAI/gpt-neox
An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron...
shibing624/textgen
TextGen: Implementation of Text Generation models, include LLaMA, BLOOM, GPT2, BART, T5, SongNet...
ai-forever/ru-gpts
Russian GPT3 models.
AdityaNG/kan-gpt
The PyTorch implementation of Generative Pre-trained Transformers (GPTs) using Kolmogorov-Arnold...