ssrhaso/microjpt
The most atomic way to train and run inference for a GPT in 100 lines of pure, dependency-free Julia.
This project trains and runs a minimal GPT-style language model from scratch: you provide a text dataset, and the model learns its patterns to generate new, similar text. It's aimed at developers who want to understand the underlying mechanics of language models, especially those working in the Julia programming language.
Use this if you are a developer looking to understand or implement a basic Generative Pre-trained Transformer (GPT) from first principles in Julia, without external machine learning frameworks.
Not ideal if you need a production-ready large language model, require extensive features beyond basic text generation, or prefer to work within established machine learning frameworks.
Stars
98
Forks
5
Language
Julia
License
MIT
Category
Last pushed
Mar 11, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/ssrhaso/microjpt"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.
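A minimal sketch of consuming the endpoint from Python, using only the standard library. The JSON field names below (`stars`, `forks`, `language`) are hypothetical, inferred from the listing above, and may differ from the real payload:

```python
import json
import urllib.request

URL = "https://pt-edge.onrender.com/api/v1/quality/llm-tools/ssrhaso/microjpt"

def fetch_repo_quality(url=URL):
    # Fetch the listing as JSON; no API key is needed under the free tier.
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

# Working with a response once fetched; this sample payload mirrors the
# listing's metadata, but the field names are assumptions, not confirmed.
sample = json.loads('{"stars": 98, "forks": 5, "language": "Julia"}')
print(sample["stars"], sample["language"])
```

Calling `fetch_repo_quality()` requires network access; the parsing step works on any JSON payload the endpoint returns.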
Higher-rated alternatives
vixhal-baraiya/microgpt-c
The most atomic way to train and inference a GPT in pure, dependency-free C
milanm/AutoGrad-Engine
A complete GPT language model (training and inference) in ~600 lines of pure C#, zero dependencies
LeeSinLiang/microGPT
Implementation of GPT from scratch. Designed to be lightweight and easy to modify.
dubzdubz/microgpt-ts
A complete GPT built from scratch in TypeScript with zero dependencies
biegehydra/NanoGptDotnet
A miniature large language model (LLM) that generates Shakespeare-like text, written in C#....