fangpin/llm-from-scratch — Build LLM from scratch

Quality score: 41 / 100 (Emerging)

This project helps machine learning engineers and researchers understand how large language models (LLMs) are built from the ground up. It takes raw text, processes it, and produces a custom-trained LLM that can generate new text or perform fine-tuned tasks. The primary users are those who want to learn the internal workings of modern LLMs by implementing them from scratch.

Use this if you are a machine learning engineer or researcher who wants a clear, modular, and educational resource to build and understand a decoder-only Transformer model and its components.

Not ideal if you are looking for a pre-built LLM to use directly for application development, or if you want to fine-tune an existing large model without understanding its core architecture.
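To make concrete what "implementing a decoder-only Transformer from scratch" involves, here is a minimal sketch of its core operation, single-head causal self-attention, in plain NumPy. This is an illustrative example only; the function name, weight shapes, and single-head simplification are assumptions for the sketch, not taken from the repository's actual code.

```python
import numpy as np

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head causal self-attention over a sequence x of shape (T, d).

    Hypothetical sketch of the central decoder-only Transformer operation;
    the repo's real implementation (multi-head, batched, learned weights)
    will differ in detail.
    """
    T, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv          # project inputs to queries/keys/values
    scores = q @ k.T / np.sqrt(d)             # scaled dot-product similarities
    # Causal mask: position t may attend only to positions <= t.
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores[mask] = -np.inf
    # Numerically stable softmax over each row.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                        # attention-weighted mix of values

rng = np.random.default_rng(0)
T, d = 4, 8
x = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = causal_self_attention(x, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Because of the causal mask, the first output position depends only on the first input token, which is what lets a decoder-only model be trained to predict the next token at every position in parallel.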

Topics: Large Language Model Development, Deep Learning Architecture, Natural Language Processing Research, Transformer Models, Machine Learning Education

No package · No dependents
Maintenance: 6 / 25
Adoption: 9 / 25
Maturity: 15 / 25
Community: 11 / 25


Stars: 97
Forks: 8
Language: Python
License: MIT
Last pushed: Nov 19, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/fangpin/llm-from-scratch"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.