microGPT and micro-gpt-and-beyond
The two tools are competitors built around the same "tiny GPT" concept. danilop/micro-gpt-and-beyond covers a broader range of implementation frameworks, making it a potential alternative or a more comprehensive learning resource, while LeeSinLiang/microGPT offers a single, lightweight Python implementation.
About microGPT
LeeSinLiang/microGPT
Implementation of GPT from scratch. Designed to be lightweight and easy to modify.
This project helps AI engineers and machine learning researchers understand and build Generative Pre-trained Transformer (GPT) models from the ground up. You provide a dataset of text, and it produces a small, custom GPT model capable of generating new text based on its training. It's ideal for those looking to learn the internals of generative AI and experiment with their own custom language models on consumer-grade hardware.
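The workflow described above (feed in a text dataset, get back a model that generates new text) can be illustrated with a toy stand-in. The sketch below is not microGPT's actual API or a transformer; it is a character-level bigram model, assumed here purely to show the dataset-in, generator-out loop in a few lines of plain Python.

```python
import random

def train_bigram(text):
    """Count character-to-character transitions in the training text.

    A real GPT learns these statistics (and far richer ones) with a
    transformer; here we just tally adjacent-character counts.
    """
    counts = {}
    for a, b in zip(text, text[1:]):
        counts.setdefault(a, {}).setdefault(b, 0)
        counts[a][b] += 1
    return counts

def generate(counts, start, length, seed=0):
    """Sample new text by walking the learned transition table."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        nxt = counts.get(out[-1])
        if not nxt:  # no known continuation for this character
            break
        chars, weights = zip(*nxt.items())
        out.append(rng.choices(chars, weights=weights)[0])
    return "".join(out)

# "Provide a dataset of text" ...
corpus = "hello world, hello there, hello again"
model = train_bigram(corpus)
# ... "and it produces new text based on its training".
print(generate(model, "h", 20))
```

A real implementation like microGPT replaces the count table with a trained transformer, but the overall shape of the loop (train on a corpus, then sample token by token) is the same.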
About micro-gpt-and-beyond
danilop/micro-gpt-and-beyond
Six implementations of the same tiny GPT language model — from pure Python to PyTorch, JAX, and MLX. Inspired by Andrej Karpathy's microGPT.