ssrhaso/microjpt

The most atomic way to train and run inference for a GPT in 100 lines of pure, dependency-free Julia.

Quality score: 38 / 100 (Emerging)

This project is a minimal, from-scratch implementation for training and using a simple language model that generates text. You provide a text dataset, and the model learns its patterns to produce new, similar text. It's designed for developers interested in the underlying mechanics of language models, especially those working in the Julia programming language.
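As a rough illustration of that workflow, here is a hypothetical, dependency-free Julia sketch (not code from this repository): it learns character-to-character statistics from a tiny stand-in dataset and samples new text from them. A GPT replaces the count table with a trained transformer, but the train-then-generate loop is the same.

# Hypothetical sketch, not microjpt's code: a character-level bigram
# model in pure Julia -- learn patterns from text, then generate more.
text  = "hello world. hello julia. "        # stand-in for a real dataset
cs    = collect(text)
chars = sort(unique(cs))                    # character vocabulary
ix    = Dict(c => i for (i, c) in enumerate(chars))
V     = length(chars)

counts = ones(Float64, V, V)                # add-one smoothing
for (a, b) in zip(cs[1:end-1], cs[2:end])
    counts[ix[a], ix[b]] += 1               # tally next-character frequencies
end
probs = counts ./ sum(counts, dims = 2)     # each row: a next-char distribution

function generate(n; start = 'h')
    out = [start]
    for _ in 2:n
        p = probs[ix[out[end]], :]          # distribution after the last char
        c = cumsum(p)
        push!(out, chars[searchsortedfirst(c, rand() * c[end])])
    end
    join(out)
end

println(generate(40))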

Use this if you are a developer looking to understand or implement a basic Generative Pre-trained Transformer (GPT) from first principles in Julia, without external machine learning frameworks.
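To make "first principles" concrete, the core operation such an implementation has to spell out by hand is causal self-attention. The sketch below is hypothetical (all names are illustrative, nothing here is taken from the repo) and uses only plain Julia, no packages:

# Hypothetical sketch, not microjpt's code: single-head causal
# self-attention in plain Julia -- the building block of a GPT.
T, d = 5, 8                       # sequence length, embedding size
X  = randn(d, T)                  # token embeddings, one column per token
Wq = randn(d, d) / sqrt(d)        # query projection
Wk = randn(d, d) / sqrt(d)        # key projection
Wv = randn(d, d) / sqrt(d)        # value projection

Q, K, Vv = Wq * X, Wk * X, Wv * X

S = (K' * Q) ./ sqrt(d)           # S[i, j]: how much token j attends to i
for j in 1:T, i in 1:T
    i > j && (S[i, j] = -Inf)     # causal mask: no attending to the future
end
A = exp.(S .- maximum(S, dims = 1))
A ./= sum(A, dims = 1)            # column-wise softmax over past tokens
Y = Vv * A                        # d×T output: weighted mix of value vectors

@assert size(Y) == (d, T)

Stacking this with a feed-forward layer, residual connections, and layer normalization gives one transformer block; a GPT is a stack of such blocks plus an embedding table and a training loop.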

Not ideal if you need a production-ready large language model, require extensive features beyond basic text generation, or prefer to work within established machine learning frameworks.

Tags: Julia programming, text generation, machine learning, implementation, model training, performance optimization
No package, no dependents
Maintenance: 10 / 25
Adoption: 9 / 25
Maturity: 11 / 25
Community: 8 / 25


Stars: 98
Forks: 5
Language: Julia
License: MIT
Last pushed: Mar 11, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/ssrhaso/microjpt"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
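From Julia itself, the same endpoint can be queried with the Downloads standard library. This minimal sketch simply prints the raw JSON response; parsing it would need a package such as JSON.jl.

# Fetch the quality record for ssrhaso/microjpt and print the raw JSON.
using Downloads

url = "https://pt-edge.onrender.com/api/v1/quality/llm-tools/ssrhaso/microjpt"
buf = IOBuffer()
Downloads.request(url; output = buf)   # free tier: 100 requests/day, no key
println(String(take!(buf)))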