pbloem/former
Simple transformer implementation from scratch in pytorch. (archival, latest version on codeberg)
This project is an archived, from-scratch implementation of a transformer neural network in PyTorch, useful for understanding the foundational architecture. It takes sequential data, such as text, and processes it into another sequence or a representation, demonstrating how transformer models learn relationships within data. It is intended for students and researchers who want to study the core mechanics of transformers from a computational perspective.
1,092 stars. No commits in the last 6 months.
Use this if you are a student or researcher wanting to learn the fundamental, low-level implementation of a transformer architecture from scratch.
Not ideal if you need a production-ready, high-performance, or actively maintained transformer library for real-world applications.
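The core mechanic such a from-scratch implementation centers on is scaled dot-product self-attention. The sketch below is a minimal, single-head version in NumPy for illustration only; it is not code from the pbloem/former repository, and the function and variable names are my own.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product self-attention (illustrative sketch).

    x: (seq_len, d_model) input embeddings
    w_q, w_k, w_v: (d_model, d_k) learned projection matrices
    """
    q = x @ w_q  # queries
    k = x @ w_k  # keys
    v = x @ w_v  # values
    # Similarity of every token to every other token, scaled by sqrt(d_k)
    scores = q @ k.T / np.sqrt(k.shape[-1])
    # Row-wise softmax turns scores into attention weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted mix of the value vectors
    return weights @ v

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))  # 4 tokens, model dimension 8
w_q, w_k, w_v = (rng.standard_normal((8, 8)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (4, 8): one mixed representation per token
```

A full transformer block, as implemented in repositories like this one, stacks several such heads, then adds residual connections, layer normalization, and a feed-forward sublayer.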
Stars: 1,092
Forks: 172
Language: Python
License: MIT
Category: transformers
Last pushed: Mar 20, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/pbloem/former"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
Higher-rated alternatives
huggingface/transformers
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in...
kyegomez/LongNet
Implementation of plug in and play Attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
NVIDIA/FasterTransformer
Transformer related optimization, including BERT, GPT
kyegomez/SimplifiedTransformers
SimplifiedTransformer simplifies transformer block without affecting training. Skip connections,...
ARM-software/keyword-transformer
Official implementation of the Keyword Transformer: https://arxiv.org/abs/2104.00769