vixhal-baraiya/microgpt-c
The most atomic way to train and run inference for a GPT in pure, dependency-free C
This project provides a minimal, from-scratch implementation of a GPT model: it trains on text data and generates new text samples from what it has learned. It is aimed at researchers, hobbyists, and educators interested in the absolute basics of how a GPT model works from the ground up.
Use this if you want to understand, teach, or experiment with the core mechanics of a Generative Pre-trained Transformer (GPT) model without any complex libraries or dependencies.
Not ideal if you need to build a production-ready GPT application, require advanced features like fine-tuning large models, or prefer working with high-level AI frameworks.
Stars
234
Forks
44
Language
C
License
MIT
Category
Last pushed
Feb 15, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/vixhal-baraiya/microgpt-c"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
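The endpoint returns JSON. A minimal sketch of consuming it in Python follows; the response schema is not documented on this page, so the field names below are assumptions inferred from the stats listed above (the actual keys may differ):

```python
import json

# Hypothetical response body. The real schema is undocumented here; these
# field names are assumptions based on the stats shown on this page.
sample = """{
  "repo": "vixhal-baraiya/microgpt-c",
  "stars": 234,
  "forks": 44,
  "language": "C",
  "license": "MIT",
  "last_pushed": "2026-02-15",
  "commits_30d": 0
}"""

data = json.loads(sample)

# Summarize the record in one line.
summary = (
    f"{data['repo']}: {data['stars']} stars, {data['forks']} forks "
    f"({data['language']}, {data['license']})"
)
print(summary)
```

In practice you would replace `sample` with the body returned by the curl command above (fetched with `urllib.request` or `requests`), keeping the same parsing step.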
Related tools
milanm/AutoGrad-Engine
A complete GPT language model (training and inference) in ~600 lines of pure C#, zero dependencies
LeeSinLiang/microGPT
An implementation of GPT from scratch, designed to be lightweight and easy to modify.
dubzdubz/microgpt-ts
A complete GPT built from scratch in TypeScript with zero dependencies
biegehydra/NanoGptDotnet
A miniature large language model (LLM) that generates Shakespeare-like text, written in C#…
ssrhaso/microjpt
The most atomic way to train and run inference for a GPT in 100 lines of pure, dependency-free Julia.