lengyueit/gpt-mini
A simple re-implementation of OpenAI GPT
This project helps developers and researchers experiment with a foundational generative AI model. It provides a simple re-implementation of the GPT-1 architecture, allowing you to input text data (like multi-turn conversations) and train a model that can then generate conversational text. It is ideal for machine learning engineers, AI researchers, or students looking to understand and build basic large language models.
No commits in the last 6 months.
Use this if you want to train and experiment with a basic conversational AI model from scratch using your own text data.
Not ideal if you need a high-quality, production-ready conversational AI or if you are not comfortable with machine learning model training and development.
Stars
20
Forks
2
Language
Python
License
—
Category
—
Last pushed
Nov 29, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/lengyueit/gpt-mini"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
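For scripted access, the endpoint above can be wrapped in a few lines of Python. This is a minimal sketch using only the standard library; the URL pattern is taken from the curl command above, but the JSON response schema is not documented on this page, so `fetch_quality` is an assumption — inspect the payload before relying on any particular field.

```python
import json
import urllib.request

# Base endpoint, as shown in the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"


def quality_url(owner: str, repo: str) -> str:
    """Build the quality-data endpoint URL for a given repository."""
    return f"{API_BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload for a repository.

    Note: the response field names are not documented here,
    so this assumes only that the endpoint returns JSON.
    """
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)


print(quality_url("lengyueit", "gpt-mini"))
```

Within the free tier (100 requests/day without a key), `fetch_quality("lengyueit", "gpt-mini")` would retrieve the same data shown on this page.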
Higher-rated alternatives
LowinLi/transformers-stream-generator
This is a text generation method which returns a generator, streaming out each token in...
ystemsrx/mini-nanoGPT
One-click training of your own GPT. Training a GPT has never been easier for beginners. /...
jaymody/picoGPT
An unnecessarily tiny implementation of GPT-2 in NumPy.
kyegomez/AttentionGrid
A network of attention mechanisms at your fingertips. Unleash the potential of attention...
kamalkraj/minGPT-TF
A minimal TF2 re-implementation of the OpenAI GPT training