microGPT and micro-gpt-and-beyond

The two tools are competitors: danilop/micro-gpt-and-beyond implements the same "tiny GPT" concept across a broader range of frameworks, making it a potential alternative, or a more comprehensive learning resource, next to LeeSinLiang/microGPT's single, lightweight Python implementation.

| Metric | microGPT | micro-gpt-and-beyond |
| --- | --- | --- |
| Overall score | 46 (Emerging) | 36 (Emerging) |
| Maintenance | 6/25 | 10/25 |
| Adoption | 9/25 | 3/25 |
| Maturity | 16/25 | 11/25 |
| Community | 15/25 | 12/25 |
| Stars | 113 | 4 |
| Forks | 16 | 1 |
| Downloads | — | — |
| Commits (30d) | 0 | 0 |
| Language | Python | Python |
| License | MIT | — |
| Package | none published | none published |
| Dependents | none | none |

About microGPT

LeeSinLiang/microGPT

Implementation of GPT from scratch. Designed to be lightweight and easy to modify.

This project helps AI engineers and machine learning researchers understand and build Generative Pre-trained Transformer (GPT) models from the ground up. You provide a dataset of text, and it produces a small, custom GPT model capable of generating new text based on its training. It's ideal for those looking to learn the internals of generative AI and experiment with their own custom language models on consumer-grade hardware.
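Neither project's actual API appears on this page, but the mechanism at the heart of any GPT built "from the ground up" is causal self-attention. As a rough, dependency-free sketch of that one component (function names are illustrative, not taken from either repo):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def causal_attention(q, k, v):
    """Scaled dot-product attention with a causal mask:
    token i may only attend to tokens 0..i (no peeking ahead)."""
    d = len(q[0])  # head dimension
    out = []
    for i, qi in enumerate(q):
        # similarity of query i against keys up to position i
        scores = [sum(a * b for a, b in zip(qi, k[j])) / math.sqrt(d)
                  for j in range(i + 1)]
        weights = softmax(scores)
        # weighted mix of the visible value vectors
        out.append([sum(w * v[j][t] for j, w in enumerate(weights))
                    for t in range(len(v[0]))])
    return out
```

In a full model this runs per attention head inside each transformer block; the causal mask is what lets the same network be trained on next-token prediction over a plain text dataset.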

generative-AI machine-learning-research natural-language-processing model-training AI-education

About micro-gpt-and-beyond

danilop/micro-gpt-and-beyond

Six implementations of the same tiny GPT language model — from pure Python to PyTorch, JAX, and MLX. Inspired by Andrej Karpathy's microGPT.
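The "generating new text" step shared by all six implementations is an autoregressive loop: score the next token, sample one, append, repeat. A minimal sketch of the sampling step in pure Python (temperature sampling from raw logits; the function name is my own, not from the repo):

```python
import math
import random

def sample_next(logits, temperature=1.0, rng=random):
    """Sample a token index from raw logits.
    Lower temperature sharpens the distribution toward the argmax."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # inverse-CDF sampling over the categorical distribution
    r = rng.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1  # guard against floating-point round-off
```

The per-framework versions differ only in how the logits are produced (pure Python loops vs. PyTorch, JAX, or MLX tensors); the sampling loop itself looks the same everywhere.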

Scores updated daily from GitHub, PyPI, and npm data.