akanyaani/minGPTF
A TensorFlow re-implementation of Karpathy's minGPT (Generative Pretrained Transformer) training.
This project helps machine learning engineers and researchers implement and experiment with Generative Pretrained Transformer (GPT) models using TensorFlow. It takes raw text data as input and produces a trained language model that can generate new, coherent text based on learned patterns. The primary users are developers and ML practitioners who are familiar with deep learning frameworks and the architecture of large language models.
No commits in the last 6 months.
Use this if you are a machine learning engineer or researcher looking to train or fine-tune GPT-like models using the TensorFlow framework.
Not ideal if you are an end-user without programming skills who just wants to use a pre-trained text generation tool.
Stars: 8
Forks: 2
Language: Python
License: MIT
Category:
Last pushed: Jan 10, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/akanyaani/minGPTF"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.
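As a sketch, the same endpoint can be called from Python instead of curl. The URL pattern below follows the curl example above ({owner}/{repo} appended to the quality path); the response fields shown are an assumption for illustration, since the actual JSON schema is not documented on this page.

```python
import json
from urllib.request import urlopen  # stdlib HTTP client; no extra dependencies

# Endpoint taken from the curl example above; the path pattern is {owner}/{repo}.
BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"
url = f"{BASE}/akanyaani/minGPTF"

# Hypothetical response body -- field names here are assumed, not confirmed.
sample = '{"stars": 8, "forks": 2, "language": "Python", "license": "MIT"}'
data = json.loads(sample)

# In a live script you would replace `sample` with:
#   data = json.load(urlopen(url))
print(url)
print(data["stars"], data["language"])
```

Parsing a canned sample first makes the script testable without network access; swapping in the `urlopen` call is a one-line change once you have a key (or are within the free quota).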
Compare
Higher-rated alternatives
LowinLi/transformers-stream-generator
This is a text generation method which returns a generator, streaming out each token in...
ystemsrx/mini-nanoGPT
One-click training of your own GPT. Training a GPT has never been easier for beginners. ...
jaymody/picoGPT
An unnecessarily tiny implementation of GPT-2 in NumPy.
kyegomez/AttentionGrid
A network of attention mechanisms at your fingertips. Unleash the potential of attention...
kamalkraj/minGPT-TF
A minimal TF2 re-implementation of the OpenAI GPT training