dhruvjverma/NanoLanguageModel
A minimalist, high-performance GPT implementation in PyTorch, optimized for research and training on the TinyStories dataset.
Stars: —
Forks: —
Language: Python
License: MIT
Category: —
Last pushed: Jan 08, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/dhruvjverma/NanoLanguageModel"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.
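The endpoint above can be queried from Python as well as curl. A minimal sketch follows; the `quality_url` helper is our own (the URL pattern is taken from the curl example, and the actual JSON response fields are not documented here, so the fetch step is only indicated in comments).

```python
def quality_url(owner: str, repo: str,
                base: str = "https://pt-edge.onrender.com") -> str:
    """Build the quality-API URL for a repo.

    The path pattern mirrors the curl example above; this helper
    name is illustrative, not part of any official client.
    """
    return f"{base}/api/v1/quality/transformers/{owner}/{repo}"

url = quality_url("dhruvjverma", "NanoLanguageModel")
print(url)

# To actually fetch the data (left commented out to avoid a live
# network call; uses only the standard library):
# import json, urllib.request
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
```

Because the API is keyless for up to 100 requests/day, no authentication header is needed for light use.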
Higher-rated alternatives
bahree/helloLondon
Historical Language Model for London - A specialized LLM trained on 1500-1850 historical English text
Chunjiang-Intelligence/Credal-Transformer
Paper: "Credal Transformer: A Principled Approach for Quantifying and Mitigating Hallucinations in...
MihneaTeodorStoica/mono-lm
Character-level language model focused on training, architecture, and optimization.
imreallyexited/Independent-LLM-Project
PyTorch framework for building and pre-training LLMs.
gurpejsingh13/punjabi-gpt-scratch-20m
Developed and pre-trained a 20.39M-parameter Punjabi GPT-style base model from scratch,...