FareedKhan-dev/create-million-parameter-llm-from-scratch

Building a 2.3M-parameter LLM from scratch with the LLaMA 1 architecture.

Score: 39 / 100 (Emerging)

This project guides machine learning engineers and researchers through the practical steps of building a small-scale Large Language Model (LLM) on the LLaMA 1 architecture. Starting from raw text data, it walks through the code and implementation needed to train an LLM from scratch, without requiring high-end GPUs. It is aimed at people who want to understand the nuts and bolts of LLM creation beyond the theory.
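To make the architecture concrete, here is a minimal PyTorch sketch of one LLaMA-1-style building block, RMSNorm (LLaMA 1 also uses rotary positional embeddings and SwiGLU). This is an illustrative snippet under those assumptions, not the repository's actual notebook code:

# Minimal sketch of RMSNorm, the pre-normalization layer LLaMA 1 uses.
# Illustrative only; not copied from the repository's notebook.
import torch
import torch.nn as nn

class RMSNorm(nn.Module):
    def __init__(self, dim: int, eps: float = 1e-6):
        super().__init__()
        self.eps = eps
        self.weight = nn.Parameter(torch.ones(dim))  # learnable per-feature gain

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Scale each vector by the reciprocal of its root-mean-square,
        # then apply the learned gain; no mean-centering, unlike LayerNorm.
        rms = torch.rsqrt(x.pow(2).mean(dim=-1, keepdim=True) + self.eps)
        return x * rms * self.weight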

201 stars. No commits in the last 6 months.

Use this if you are a machine learning practitioner with a basic understanding of neural networks and Python, eager to implement your own LLM architecture hands-on.

Not ideal if you are looking for a pre-trained, production-ready LLM or a tool for fine-tuning existing large models.

natural-language-processing deep-learning-engineering language-model-development machine-learning-research ai-model-building
No License · Stale (6 months) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 8 / 25
Community: 21 / 25
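The headline number appears to be the simple sum of the four category scores, each out of 25; this is an inference from the figures shown, not a documented formula:

# Hedged check: the four category scores above sum to the 39/100 headline.
scores = {"Maintenance": 0, "Adoption": 10, "Maturity": 8, "Community": 21}
assert sum(scores.values()) == 39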

Stars: 201
Forks: 42
Language: Jupyter Notebook
License: None
Last pushed: May 12, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/FareedKhan-dev/create-million-parameter-llm-from-scratch"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
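For scripted access, a minimal Python sketch of the same request; the response schema is not documented here, so the snippet prints the raw JSON instead of guessing field names:

# Fetch the quality report for this repo; endpoint taken from the curl above.
import requests

url = ("https://pt-edge.onrender.com/api/v1/quality/transformers/"
       "FareedKhan-dev/create-million-parameter-llm-from-scratch")
resp = requests.get(url, timeout=10)
resp.raise_for_status()  # free tier: 100 requests/day without a key
print(resp.json())       # schema undocumented here; inspect the raw payload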