LeeSinLiang/microGPT

Implementation of GPT from scratch. Designed to be lightweight and easy to modify.

Score: 46 / 100 (Emerging)

This project helps AI engineers and machine learning researchers understand and build Generative Pre-trained Transformer (GPT) models from the ground up. You provide a dataset of text, and it produces a small, custom GPT model that generates new text in the style of its training data. It's ideal for those looking to learn the internals of generative AI and experiment with their own custom language models on consumer-grade hardware.
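To give a sense of what "building GPT from the ground up" involves, here is a generic PyTorch sketch of a causal self-attention block, the core mechanism behind GPT-style text generation. This is an illustrative example only, not microGPT's actual code; the class name and parameters are hypothetical.

# Generic illustration (not microGPT's code): causal self-attention over token embeddings.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttention(nn.Module):
    def __init__(self, embed_dim: int, num_heads: int, block_size: int):
        super().__init__()
        assert embed_dim % num_heads == 0
        self.num_heads = num_heads
        self.qkv = nn.Linear(embed_dim, 3 * embed_dim)   # project to queries, keys, values
        self.proj = nn.Linear(embed_dim, embed_dim)      # output projection
        # causal mask: each position may only attend to itself and earlier positions
        mask = torch.tril(torch.ones(block_size, block_size))
        self.register_buffer("mask", mask.view(1, 1, block_size, block_size))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, C = x.shape
        q, k, v = self.qkv(x).split(C, dim=2)
        # reshape to (batch, heads, time, head_dim)
        q = q.view(B, T, self.num_heads, C // self.num_heads).transpose(1, 2)
        k = k.view(B, T, self.num_heads, C // self.num_heads).transpose(1, 2)
        v = v.view(B, T, self.num_heads, C // self.num_heads).transpose(1, 2)
        att = (q @ k.transpose(-2, -1)) / (k.size(-1) ** 0.5)   # scaled dot-product scores
        att = att.masked_fill(self.mask[:, :, :T, :T] == 0, float("-inf"))
        att = F.softmax(att, dim=-1)
        out = (att @ v).transpose(1, 2).contiguous().view(B, T, C)
        return self.proj(out)

# quick smoke test on random embeddings
x = torch.randn(2, 8, 64)                      # (batch, sequence, embedding)
attn = CausalSelfAttention(embed_dim=64, num_heads=4, block_size=8)
print(attn(x).shape)                           # torch.Size([2, 8, 64])

A full GPT stacks blocks like this with feed-forward layers, layer normalization, and token/position embeddings; keeping those pieces small is what makes training feasible on consumer-grade hardware.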

113 stars.

Use this if you are an AI engineer or researcher who wants to learn how to implement GPT from scratch, or need a lightweight, customizable language model for small-scale applications and experimentation.

Not ideal if you need a production-ready, high-performance GPT model for large-scale applications or expect state-of-the-art text generation quality out-of-the-box.

generative-AI machine-learning-research natural-language-processing model-training AI-education
No package · No dependents
Maintenance: 6 / 25
Adoption: 9 / 25
Maturity: 16 / 25
Community: 15 / 25


Stars: 113
Forks: 16
Language: Python
License: MIT
Last pushed: Oct 16, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/LeeSinLiang/microGPT"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
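To fetch the same data from Python instead of curl, a minimal sketch using the requests library is shown below (assuming the endpoint returns JSON; the exact response fields depend on the API schema):

# Minimal sketch: fetch the quality data for this repo via the public API.
# No API key is needed for up to 100 requests/day.
import requests

url = "https://pt-edge.onrender.com/api/v1/quality/llm-tools/LeeSinLiang/microGPT"
response = requests.get(url, timeout=10)
response.raise_for_status()

data = response.json()
print(data)  # inspect the returned fields; their names depend on the API schema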