Tongjilibo/build_MiniLLM_from_scratch

Build a MiniLLM from 0 to 1 (pretrain + SFT + DPO, work in progress)

Score: 43 / 100 (Emerging)

This project helps machine learning engineers and researchers build small-scale, custom large language models (LLMs) from the ground up. You provide a dataset for pre-training and instruction-tuning, and the project outputs a specialized chat model capable of simple conversational tasks. This is ideal for those who need to experiment with LLM architectures or develop domain-specific models with controlled computational resources.

537 stars. No commits in the last 6 months.

Use this if you are a machine learning engineer or researcher who wants to understand and customize the full lifecycle of a small-scale LLM, from initial pre-training through instruction-tuning for conversational ability.

Not ideal if you need a production-ready model that can handle complex queries, or if you prefer pre-built, general-purpose models for immediate use.

Tags: natural-language-processing, conversational-ai, machine-learning-engineering, model-training, computational-linguistics
Status: Stale (6 months), No Package, No Dependents

Score breakdown:
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 17 / 25


Stars: 537
Forks: 59
Language: Python
License: MIT
Last pushed: Mar 23, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Tongjilibo/build_MiniLLM_from_scratch"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
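
For programmatic access, here is a minimal sketch in Python (the project's own language) using the requests library. The endpoint comes from the curl command above; the response field names used at the end (e.g. "score", "stars") are assumptions, since the schema is not documented on this card.

import requests

# Endpoint from the curl example above.
URL = ("https://pt-edge.onrender.com/api/v1/quality/"
       "transformers/Tongjilibo/build_MiniLLM_from_scratch")

# No key needed for up to 100 requests/day; a free key raises that to 1,000/day.
response = requests.get(URL, timeout=10)
response.raise_for_status()

data = response.json()
# Field names below are assumptions; inspect `data` for the actual schema.
print(data.get("score"), data.get("stars"))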