FareedKhan-dev/create-million-parameter-llm-from-scratch
Building a 2.3M-parameter LLM from scratch with LLaMA 1 architecture.
This project guides machine learning engineers and researchers through the practical steps of building a small-scale Large Language Model (LLM) using the LLaMA 1 architecture. Starting from raw text data, it walks through the full pipeline to a trained model, with code and implementation details that run without high-end GPUs. It is aimed at individuals who want to understand the nuts and bolts of LLM creation beyond just theory.
201 stars. No commits in the last 6 months.
Use this if you are a machine learning practitioner with a basic understanding of neural networks and Python, eager to implement your own LLM architecture hands-on.
Not ideal if you are looking for a pre-trained, production-ready LLM or a tool for fine-tuning existing large models.
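As a back-of-the-envelope check on the "2.3M-parameter" figure, a LLaMA-style decoder's parameter count follows directly from its hyperparameters (embedding table, attention projections, SwiGLU feed-forward, RMSNorm weights). The sketch below uses illustrative values, not necessarily the repo's actual configuration:

```python
def llama_param_count(vocab_size: int, dim: int, n_layers: int, ffn_hidden: int) -> int:
    """Approximate parameter count for a LLaMA-style decoder-only transformer."""
    embed = vocab_size * dim                 # token embedding table
    per_layer = (
        4 * dim * dim                        # attention: Wq, Wk, Wv, Wo
        + 3 * dim * ffn_hidden               # SwiGLU FFN: gate, up, down projections
        + 2 * dim                            # two RMSNorm weight vectors per block
    )
    final_norm = dim                         # RMSNorm before the LM head
    lm_head = vocab_size * dim               # untied output projection
    return embed + n_layers * per_layer + final_norm + lm_head

# Illustrative hyperparameters (assumptions): character-level vocab of 65,
# dim=256, 3 layers, SwiGLU hidden size 688 -> roughly 2.4M parameters,
# i.e. the same order of magnitude as this project's model.
total = llama_param_count(vocab_size=65, dim=256, n_layers=3, ffn_hidden=688)
print(total)  # 2406656
```

Tying the LM head to the embedding table, or changing the FFN hidden size, shifts the total noticeably at this scale, which is why small configs land anywhere from ~1M to ~3M parameters.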
Stars
201
Forks
42
Language
Jupyter Notebook
License
—
Category
Last pushed
May 12, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/FareedKhan-dev/create-million-parameter-llm-from-scratch"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
rasbt/LLMs-from-scratch
Implement a ChatGPT-like LLM in PyTorch from scratch, step by step
facebookresearch/LayerSkip
Code for "LayerSkip: Enabling Early Exit Inference and Self-Speculative Decoding", ACL 2024
FareedKhan-dev/train-llm-from-scratch
A straightforward method for training your LLM, from downloading data to generating text.
kmeng01/rome
Locating and editing factual associations in GPT (NeurIPS 2022)
datawhalechina/llms-from-scratch-cn
Build a large language model from scratch with only basic Python; incrementally construct GLM4, Llama3, and RWKV6 from zero to gain a deep understanding of how large models work.