minGPT-TF and minGPTF
These two tools are direct alternatives: both are minimal TensorFlow re-implementations of the GPT training process, and one explicitly credits Karpathy's `minGPT`.
About minGPT-TF
kamalkraj/minGPT-TF
A minimal TF2 re-implementation of the OpenAI GPT training
This project helps machine learning practitioners or researchers understand and implement the core components of GPT-like models using TensorFlow. It takes a sequence of numerical tokens (representing text or other discrete data) and outputs a probability distribution for the next token in the sequence. Data scientists, AI researchers, or students learning about generative models would find this useful for experimenting with foundational transformer architectures.
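The core idea described above — mapping a sequence of token IDs to a probability distribution over the next token — can be sketched framework-agnostically. The toy "model" below (mean-pooled embeddings plus a linear head, in plain NumPy) is a hypothetical stand-in for the transformer stack, not code from minGPT-TF itself:

```python
import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

# Hypothetical toy setup: a vocabulary of 8 token IDs and random weights.
rng = np.random.default_rng(0)
vocab_size = 8
embedding = rng.normal(size=(vocab_size, 4))   # token embedding table
head = rng.normal(size=(4, vocab_size))        # output projection to logits

def next_token_distribution(token_ids):
    # Mean-pool the context embeddings (a stand-in for the transformer
    # blocks), project to vocabulary-sized logits, normalize with softmax.
    context = embedding[token_ids].mean(axis=0)
    logits = context @ head
    return softmax(logits)

probs = next_token_distribution([3, 1, 4])
print(probs.shape)  # one probability per vocabulary entry
```

A real GPT replaces the mean-pooling step with stacked self-attention blocks, but the input/output contract — token IDs in, a normalized distribution over the vocabulary out — is the same.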
About minGPTF
akanyaani/minGPTF
A TF re-implementation of Karpathy's minGPT (Generative Pretrained Transformer) training
This project helps machine learning engineers and researchers implement and experiment with Generative Pretrained Transformer (GPT) models using TensorFlow. It takes raw text data as input and produces a trained language model that can generate new, coherent text based on learned patterns. The primary users are developers and ML practitioners who are familiar with deep learning frameworks and the architecture of large language models.
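Going from raw text to training data is the first step such a trainer performs: build a character vocabulary, encode the text as token IDs, and pair each context window with the same window shifted by one token. The sketch below illustrates that preprocessing in plain Python; the variable names are illustrative, not taken from minGPTF:

```python
# Hypothetical sketch of char-level data preparation, the kind of
# preprocessing a minGPT-style trainer does before fitting the model.
text = "hello world"
vocab = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(vocab)}  # char -> token ID

def encode(s):
    return [stoi[ch] for ch in s]

block_size = 4  # context length seen by the model
ids = encode(text)

# Each example pairs a context with the same context shifted by one,
# so the model learns to predict the next token at every position.
pairs = [(ids[i:i + block_size], ids[i + 1:i + block_size + 1])
         for i in range(len(ids) - block_size)]
x, y = pairs[0]
print(x, y)
```

The trainer then feeds these (input, target) pairs to the transformer and minimizes cross-entropy between the predicted next-token distribution and the target at each position.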