TatevKaren/BabyGPT-Build_GPT_From_Scratch
BabyGPT: Build Your Own GPT Large Language Model from Scratch
A step-by-step guide to pre-training generative transformer models: building GPT from scratch for generative AI in PyTorch and Python.
This project provides a step-by-step guide to building a generative AI model, similar to GPT, from scratch. It helps you understand how raw text data, like Shakespeare's works, is processed and transformed into a model that can generate new, coherent text. The ideal end-user is a student or enthusiast of artificial intelligence who wants to learn the inner workings of large language models by actively constructing one.
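A minimal sketch of the first step such from-scratch tutorials typically walk through: turning raw text into integer tokens a model can train on. This uses a character-level vocabulary built from an example string; the repository's actual tokenizer and training code may differ.

```python
# Character-level tokenization: the usual starting point when training
# a small GPT on raw text such as Shakespeare's works.
text = "To be, or not to be"

chars = sorted(set(text))                      # vocabulary: unique characters
stoi = {ch: i for i, ch in enumerate(chars)}   # char -> integer id
itos = {i: ch for ch, i in stoi.items()}       # integer id -> char

def encode(s):
    """Map a string to a list of token ids."""
    return [stoi[c] for c in s]

def decode(ids):
    """Map a list of token ids back to a string."""
    return "".join(itos[i] for i in ids)

tokens = encode(text)
```

The encode/decode pair is lossless by construction, so the model's integer inputs can always be mapped back to readable text.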
116 stars. No commits in the last 6 months.
Use this if you want to learn the fundamental components and architectural choices that make up modern generative AI language models.
Not ideal if you're looking for a ready-to-use tool to implement or fine-tune an existing large language model for a specific business task.
Stars: 116
Forks: 35
Language: Python
License: —
Category: transformers
Last pushed: Dec 05, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/TatevKaren/BabyGPT-Build_GPT_From_Scratch"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
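For programmatic access, the same endpoint can be called from Python. This is a hedged sketch: the `category/owner/repo` path structure is inferred from the example curl command above, and the response schema is not documented here, so the fetched JSON is simply printed as-is.

```python
import json
import urllib.request

# Base path taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category, owner, repo):
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{category}/{owner}/{repo}"

url = quality_url("transformers", "TatevKaren", "BabyGPT-Build_GPT_From_Scratch")

# Uncomment to actually fetch (counts against the 100 requests/day
# anonymous limit; no key needed):
# with urllib.request.urlopen(url) as resp:
#     print(json.dumps(json.load(resp), indent=2))
```

With a free API key the limit rises to 1,000 requests/day; how the key is passed (header vs. query parameter) is not specified on this page.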
Higher-rated alternatives
tabularis-ai/be_great
A novel approach for synthesizing tabular data using pretrained large language models
EleutherAI/gpt-neox
An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron...
shibing624/textgen
TextGen: Implementation of Text Generation models, include LLaMA, BLOOM, GPT2, BART, T5, SongNet...
ai-forever/ru-gpts
Russian GPT3 models.
AdityaNG/kan-gpt
The PyTorch implementation of Generative Pre-trained Transformers (GPTs) using Kolmogorov-Arnold...