trekhleb/homemade-gpt-js
A minimal TensorFlow.js re-implementation of Karpathy's minGPT (Generative Pre-trained Transformer). The GPT model itself is <300 lines of code.
This project offers a simple, hands-on way to understand how Generative Pre-trained Transformer (GPT) models work. You can feed it text data, train a small GPT model directly in your web browser on your GPU, and then generate new text with it. It's designed for anyone curious about the inner workings of large language models, from students to AI enthusiasts.
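The "generate new text" step is an autoregressive loop: the model scores every vocabulary token given the context so far, one token is sampled from those scores, and the result is appended to the context and fed back in. Below is a hypothetical, dependency-free TypeScript sketch of that loop — not code from this repo, and the `Model` type is a stand-in for a trained network:

```typescript
// Stand-in for a trained GPT: maps a context of token ids to one
// score (logit) per vocabulary token. (Hypothetical, for illustration.)
type Model = (context: number[]) => number[];

// Convert logits to probabilities with a softmax.
function softmax(logits: number[]): number[] {
  const max = Math.max(...logits); // subtract max for numerical stability
  const exps = logits.map((l) => Math.exp(l - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

// Draw one token id from a probability distribution.
function sample(probs: number[], rand: () => number = Math.random): number {
  let r = rand();
  for (let i = 0; i < probs.length; i++) {
    r -= probs[i];
    if (r <= 0) return i;
  }
  return probs.length - 1; // guard against floating-point rounding
}

// Generate `steps` new tokens by repeatedly feeding the growing
// context back into the model.
function generate(model: Model, context: number[], steps: number): number[] {
  const out = [...context];
  for (let i = 0; i < steps; i++) {
    out.push(sample(softmax(model(out))));
  }
  return out;
}
```

The real project replaces the `Model` stand-in with a transformer built on TensorFlow.js tensors, but the outer sampling loop has this same shape.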
No commits in the last 6 months.
Use this if you want to visually and interactively learn the core concepts behind GPT by training a model and generating text in your browser.
Not ideal if you're looking for a production-ready, high-performance GPT model for complex real-world applications.
Stars: 88
Forks: 11
Language: TypeScript
License: —
Category: —
Last pushed: Nov 12, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/trekhleb/homemade-gpt-js"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
tabularis-ai/be_great
A novel approach for synthesizing tabular data using pretrained large language models
EleutherAI/gpt-neox
An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron...
shibing624/textgen
TextGen: Implementation of Text Generation models, include LLaMA, BLOOM, GPT2, BART, T5, SongNet...
ai-forever/ru-gpts
Russian GPT3 models.
AdityaNG/kan-gpt
The PyTorch implementation of Generative Pre-trained Transformers (GPTs) using Kolmogorov-Arnold...