LeeSinLiang/microGPT
Implementation of GPT from scratch. Designed to be lightweight and easy to modify.
This project helps AI engineers and machine learning researchers understand and build Generative Pre-trained Transformer (GPT) models from the ground up. You provide a dataset of text, and it produces a small, custom GPT model capable of generating new text based on its training. It's ideal for those looking to learn the internals of generative AI and experiment with their own custom language models on consumer-grade hardware.
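As a hedged illustration of what "building GPT from the ground up" involves (this is not code from the repo): the core of any GPT-style model is causal self-attention, where each token can only attend to earlier positions. A minimal single-head sketch in pure Python, using the token vectors directly as queries, keys, and values (no learned projections) purely to show the attention pattern:

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def causal_self_attention(x):
    """x: list of T token vectors (each a list of floats).
    Illustrative only: real GPTs add learned Q/K/V projections,
    multiple heads, an MLP, and layer norm around this core."""
    T = len(x)
    out = []
    for t in range(T):
        # Causal mask: position t attends only to positions j <= t.
        d_k = math.sqrt(len(x[t]))
        scores = [sum(a * b for a, b in zip(x[t], x[j])) / d_k
                  for j in range(t + 1)]
        w = softmax(scores)  # attention weights sum to 1
        out.append([sum(w[j] * x[j][d] for j in range(t + 1))
                    for d in range(len(x[t]))])
    return out
```

The first position can only attend to itself, so its output equals its input; later positions mix in earlier tokens weighted by similarity.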
Use this if you are an AI engineer or researcher who wants to learn how to implement GPT from scratch, or need a lightweight, customizable language model for small-scale applications and experimentation.
Not ideal if you need a production-ready, high-performance GPT model for large-scale applications or expect state-of-the-art text generation quality out-of-the-box.
Stars
113
Forks
16
Language
Python
License
MIT
Category
Last pushed
Oct 16, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/LeeSinLiang/microGPT"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
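A minimal sketch of calling the API from Python, based only on the endpoint shown above. The response's JSON field names are not documented here, so this just parses and returns the raw body; `quality_url` and `fetch_quality` are illustrative helper names, not part of any official client:

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner, repo):
    # Build the per-repo endpoint shown in the curl example above.
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner, repo):
    # Free tier: 100 requests/day without a key, per the listing.
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)
```

Usage would be `fetch_quality("LeeSinLiang", "microGPT")`, which returns the parsed JSON for this listing.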
Higher-rated alternatives
vixhal-baraiya/microgpt-c
The most atomic way to train and run inference for a GPT in pure, dependency-free C
milanm/AutoGrad-Engine
A complete GPT language model (training and inference) in ~600 lines of pure C#, zero dependencies
dubzdubz/microgpt-ts
A complete GPT built from scratch in TypeScript with zero dependencies
biegehydra/NanoGptDotnet
A miniature large language model (LLM) that generates Shakespeare-like text, written in C#…
ssrhaso/microjpt
The most atomic way to train and run inference for a GPT in 100 lines of pure, dependency-free Julia.