adi-mish/miniformer
Miniformer is a lightweight PyTorch transformer library for researchers, educators, and tinkerers. It supports encoder-only and encoder–decoder models with multi-head attention and rotary embeddings. The codebase is small and readable, ideal for prototyping or edge deployment.
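The description mentions rotary position embeddings. As a sketch of that technique in general (this is not Miniformer's actual code, and the function name is made up for illustration), each pair of features in a token vector is rotated by a position-dependent angle:

```python
import math

def rotary_embed(x, base=10000.0):
    # x: list of token vectors, each of even length `dim`.
    # Applies RoFormer-style rotary position embedding: feature pair
    # (x[i], x[half + i]) is rotated by angle pos * base**(-i / half).
    out = []
    for pos, vec in enumerate(x):
        dim = len(vec)
        half = dim // 2
        rotated = [0.0] * dim
        for i in range(half):
            theta = pos * base ** (-i / half)
            c, s = math.cos(theta), math.sin(theta)
            rotated[i] = vec[i] * c - vec[half + i] * s
            rotated[half + i] = vec[i] * s + vec[half + i] * c
        out.append(rotated)
    return out
```

Because each pair is rotated (not scaled), vector norms are preserved, and position 0 is left unchanged; attention scores between rotated queries and keys then depend on relative position.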
No commits in the last 6 months.
Stars: —
Forks: —
Language: Python
License: GPL-3.0
Category: —
Last pushed: Aug 03, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/adi-mish/miniformer"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
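The same request can be made from Python with the standard library. This is a minimal sketch: the authorization header name and the response field names are assumptions, since the page does not document the schema or how a key is sent.

```python
import json
import urllib.request

API_URL = "https://pt-edge.onrender.com/api/v1/quality/transformers/adi-mish/miniformer"

def fetch_quality(url=API_URL, api_key=None):
    # Anonymous access allows 100 requests/day; a free key raises it to 1,000/day.
    req = urllib.request.Request(url)
    if api_key:
        # Header name is an assumption; check the API docs for the real one.
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def summarize(payload):
    # Field names here are assumptions, not a documented schema.
    return (f"{payload.get('language', '?')} / {payload.get('license', '?')}, "
            f"{payload.get('commits_30d', 0)} commits in 30d")
```

Separating the fetch from the summary keeps the parsing step testable without network access.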
Higher-rated alternatives
microsoft/LoRA
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
jadore801120/attention-is-all-you-need-pytorch
A PyTorch implementation of the Transformer model in "Attention is All You Need".
bhavnicksm/vanilla-transformer-jax
JAX/Flax implementation of 'Attention Is All You Need' by Vaswani et al....
kyegomez/SparseAttention
Pytorch Implementation of the sparse attention from the paper: "Generating Long Sequences with...
AbdelStark/attnres
Rust implementation of Attention Residuals from MoonshotAI/Kimi