minGPT-TF and minGPTF

These two projects are direct alternatives: both are minimal TensorFlow re-implementations of the GPT training process, and one of them explicitly references Karpathy's `minGPT`.

                 minGPT-TF                              minGPTF
Score            43 (Emerging)                          33 (Emerging)
Maintenance      0/25                                   0/25
Adoption         8/25                                   4/25
Maturity         16/25                                  16/25
Community        19/25                                  13/25
Stars            58                                     8
Forks            18                                     2
Downloads
Commits (30d)    0                                      0
Language         Jupyter Notebook                       Python
License          MIT                                    MIT
Flags            Stale 6m, No Package, No Dependents    Stale 6m, No Package, No Dependents

About minGPT-TF

kamalkraj/minGPT-TF

A minimal TF2 re-implementation of the OpenAI GPT training

This project helps machine learning practitioners and researchers understand and implement the core components of GPT-like models in TensorFlow. The model takes a sequence of numerical tokens (representing text or other discrete data) and outputs a probability distribution over the next token in the sequence. Data scientists, researchers, and students learning about generative models will find it useful for experimenting with foundational transformer architectures.

natural-language-generation text-prediction transformer-models sequence-modeling machine-learning-research
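The description above says the model maps a token sequence to a probability distribution over the next token. As a minimal, framework-free sketch of what that output looks like, the snippet below applies a softmax to hypothetical logits (the raw per-token scores a GPT head would emit); the logit values and the 5-token vocabulary are invented for illustration and are not from either repository.

```python
import math

def softmax(logits):
    """Convert raw model scores (logits) into a probability distribution."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits over a 5-token vocabulary after seeing some context.
logits = [2.0, 1.0, 0.1, -1.0, 0.5]
probs = softmax(logits)

# The probabilities sum to 1; the highest-scoring logit yields the
# most likely next token.
print(probs.index(max(probs)))
```

In a real GPT-style model these logits come from the final linear layer applied to the transformer's last hidden state, one score per vocabulary entry.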

About minGPTF

akanyaani/minGPTF

A TensorFlow re-implementation of Karpathy's minGPT (Generative Pretrained Transformer) training

This project helps machine learning engineers and researchers implement and experiment with Generative Pretrained Transformer (GPT) models using TensorFlow. It takes raw text data as input and produces a trained language model that can generate new, coherent text based on learned patterns. The primary users are developers and ML practitioners who are familiar with deep learning frameworks and the architecture of large language models.

natural-language-processing deep-learning text-generation language-modeling machine-learning-engineering
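The description notes that the trained model generates new text from learned patterns. Generation in GPT-style models is autoregressive: the model repeatedly predicts a next token and appends it to the context. The loop below sketches that control flow with a hypothetical stand-in for the trained model (a toy `next_token` function invented for illustration, not part of either project's API).

```python
def next_token(context):
    """Stand-in for a trained model's next-token prediction.
    A real model would score the whole context; this toy version
    just cycles through a 4-token vocabulary based on the last token."""
    vocab_size = 4
    return (context[-1] + 1) % vocab_size

def generate(prompt, steps):
    """Autoregressive generation: repeatedly append the predicted next token."""
    tokens = list(prompt)
    for _ in range(steps):
        tokens.append(next_token(tokens))
    return tokens

print(generate([0], 5))  # [0, 1, 2, 3, 0, 1]
```

Swapping the toy predictor for a trained transformer (and sampling from its output distribution instead of picking deterministically) gives the text-generation behavior the description refers to.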

Scores updated daily from GitHub, PyPI, and npm data.