clabrugere/scratch-llm

Implements an LLM similar to Meta's Llama 2 from the ground up in PyTorch, for educational purposes.

Score: 40 / 100 (Emerging)

This project offers a clear, minimal implementation of a large language model in the style of Meta's Llama 2, built with PyTorch. It helps developers and researchers understand how these models work internally by exposing the mechanics of components such as positional encoding and attention. The project takes text data, processes it, and demonstrates the core computational steps that produce a trained language model.
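To make the "attention" component concrete, here is a minimal sketch of scaled dot-product attention, the core operation in Llama-style models. This is an illustrative NumPy version, not the project's actual PyTorch code, and the function name is our own:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Illustrative attention: q, k, v are (seq_len, d) arrays."""
    d = q.shape[-1]
    # Similarity of every query with every key, scaled by sqrt(d)
    scores = q @ k.T / np.sqrt(d)
    # Numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted mix of the value rows
    return weights @ v, weights

rng = np.random.default_rng(0)
q = rng.normal(size=(4, 8))
k = rng.normal(size=(4, 8))
v = rng.normal(size=(4, 8))
out, weights = scaled_dot_product_attention(q, k, v)
```

The repository's real implementation adds multiple heads, causal masking, and rotary positional embeddings on top of this basic operation.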

No commits in the last 6 months.

Use this if you are a machine learning engineer or researcher who wants to learn the fundamental building blocks and internal workings of a Llama-like large language model from scratch, without optimization complexities.

Not ideal if you need a high-performance, production-ready language model for real-world applications or require advanced training and inference optimizations.

deep-learning-education natural-language-processing machine-learning-engineering neural-network-architecture LLM-development
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 7 / 25
Maturity 16 / 25
Community 17 / 25

How are scores calculated?

Stars: 38
Forks: 9
Language: Python
License: MIT
Last pushed: Feb 07, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/clabrugere/scratch-llm"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.