AIDajiangtang/LLM-from-scratch

Learning large language models from scratch: Transformer, GPT-2, and BERT pre-training and fine-tuning

Score: 21 / 100 (Experimental)

This project offers detailed notebooks and guides for understanding and implementing Large Language Models (LLMs) like Transformer, GPT-2, and BERT from their foundational elements. It covers pre-training, fine-tuning, and practical applications such as text classification, sentiment analysis, and building chatbots. AI developers and researchers would use this to gain hands-on experience and build custom LLM solutions.

No commits in the last 6 months.

Use this if you are an AI developer or researcher who wants to learn the mechanics of LLMs, from basic components to advanced applications, through practical code examples and clear explanations.

Not ideal if you are looking for a pre-built, ready-to-deploy LLM solution without diving into the underlying code and training processes.

AI Development · Natural Language Processing · Machine Learning Engineering · Language Model Customization
No License · Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 7 / 25
Maturity 8 / 25
Community 6 / 25


Stars: 37
Forks: 2
Language: Jupyter Notebook
License: none
Last pushed: Jul 01, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/AIDajiangtang/LLM-from-scratch"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
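For programmatic use, the endpoint URL above can be built for any repository. A minimal sketch in Python, assuming (from the single example shown) that the path pattern is `/api/v1/quality/<context>/<owner>/<repo>`; the `quality_url` helper name is hypothetical:

```python
# Sketch: constructing the quality-API URL for a repository.
# Assumption: path pattern /api/v1/quality/<context>/<owner>/<repo>,
# inferred from the example curl command above.
from urllib.parse import quote

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(context: str, owner: str, repo: str) -> str:
    """Return the (assumed) quality-API endpoint for a given repo."""
    # quote() percent-encodes any characters unsafe in a URL path segment
    return f"{BASE}/{quote(context)}/{quote(owner)}/{quote(repo)}"

print(quality_url("transformers", "AIDajiangtang", "LLM-from-scratch"))
# → https://pt-edge.onrender.com/api/v1/quality/transformers/AIDajiangtang/LLM-from-scratch
```

The resulting URL can then be fetched with any HTTP client (e.g. `curl` as shown above), subject to the stated rate limits.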