Tongjilibo/build_MiniLLM_from_scratch
Build a MiniLLM from 0 to 1 (pretraining + SFT + DPO, work in progress)
This project helps machine learning engineers and researchers build small-scale, custom large language models (LLMs) from the ground up. You provide a dataset for pre-training and instruction-tuning, and the project outputs a specialized chat model capable of simple conversational tasks. This is ideal for those who need to experiment with LLM architectures or develop domain-specific models with controlled computational resources.
537 stars. No commits in the last 6 months.
Use this if you are a machine learning engineer or researcher who wants to deeply understand and customize the entire lifecycle of building a small LLM, from initial pre-training through instruction-tuning for conversational abilities.
Not ideal if you need a production-ready, highly complex LLM capable of answering intricate questions, or if you prefer using pre-built, general-purpose models for immediate application.
Stars: 537
Forks: 59
Language: Python
License: MIT
Category: (not listed)
Last pushed: Mar 23, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Tongjilibo/build_MiniLLM_from_scratch"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
Higher-rated alternatives
AI-Hypercomputer/maxtext
A simple, performant and scalable Jax LLM!
rasbt/reasoning-from-scratch
Implement a reasoning LLM in PyTorch from scratch, step by step
mindspore-lab/mindnlp
MindSpore + 🤗Huggingface: Run any Transformers/Diffusers model on MindSpore with seamless...
mosaicml/llm-foundry
LLM training code for Databricks foundation models
rickiepark/llm-from-scratch
Code repository for the book *Build an LLM from Scratch* (Gilbut, 2025)