MDalamin5/Build-and-Finetune-LLM-From-Scratch-Deploy-via-vLLM-AWS-GCP
A complete end-to-end learning repo covering everything from building Large Language Models (LLMs) from scratch to mastering practical deep learning with PyTorch. Includes tokenizer coding, transformers, attention, training loops, model finetuning, and hands-on PyTorch projects.
- Stars: —
- Forks: 1
- Language: Jupyter Notebook
- License: MIT
- Category:
- Last pushed: Mar 15, 2026
- Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/MDalamin5/Build-and-Finetune-LLM-From-Scratch-Deploy-via-vLLM-AWS-GCP"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000 requests/day.
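The same endpoint can be called from a script instead of curl. A minimal sketch using only the Python standard library; the response is assumed to be JSON (the exact payload shape is not documented here, so inspect it before relying on specific fields):

```python
import json
import urllib.request

# Base endpoint from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"


def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality-data URL."""
    return f"{API_BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (no API key needed up to 100 req/day)."""
    with urllib.request.urlopen(quality_url(owner, repo), timeout=10) as resp:
        return json.load(resp)


if __name__ == "__main__":
    data = fetch_quality(
        "MDalamin5",
        "Build-and-Finetune-LLM-From-Scratch-Deploy-via-vLLM-AWS-GCP",
    )
    print(json.dumps(data, indent=2))
```

The network call is kept behind `__main__` so the URL helper can be reused (for example, to batch-check several repositories) without firing requests on import.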
Higher-rated alternatives
AI-Hypercomputer/maxtext
A simple, performant and scalable Jax LLM!
rasbt/reasoning-from-scratch
Implement a reasoning LLM in PyTorch from scratch, step by step
mindspore-lab/mindnlp
MindSpore + 🤗Huggingface: Run any Transformers/Diffusers model on MindSpore with seamless...
mosaicml/llm-foundry
LLM training code for Databricks foundation models
rickiepark/llm-from-scratch
Code repository for *Build a Large Language Model (From Scratch)*, Korean edition (Gilbut, 2025)