naokishibuya/simple_transformer
A Transformer Implementation that is easy to understand and customizable.
This project offers a foundational implementation of the Transformer architecture, the building block behind most modern language models. It takes raw text data as input, processes it, and outputs trained models for tasks such as machine translation. It is aimed at machine learning engineers, AI researchers, and students who are building or closely studying natural language processing systems.
No commits in the last 6 months.
Use this if you need to understand, customize, or build upon the core Transformer architecture for text-based AI applications.
Not ideal if you're looking for a plug-and-play solution for general text generation or analysis without needing to delve into the model's internals.
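For orientation, the central operation any such implementation builds on is scaled dot-product attention. The sketch below is a generic PyTorch illustration of that operation, not code taken from this repository; the function name, shapes, and masking convention are assumptions made for the example.

import torch
import torch.nn.functional as F

def scaled_dot_product_attention(query, key, value, mask=None):
    # Illustrative sketch, not the repo's API.
    d_k = query.size(-1)
    # Similarity scores between queries and keys, scaled by sqrt(d_k).
    scores = query @ key.transpose(-2, -1) / d_k ** 0.5
    if mask is not None:
        # Positions marked False in the boolean mask are excluded from attention.
        scores = scores.masked_fill(~mask, float("-inf"))
    weights = F.softmax(scores, dim=-1)
    # Weighted sum of the values gives the attention output.
    return weights @ value

# Toy usage: batch of 2 sequences, 5 tokens, 16-dimensional embeddings.
x = torch.randn(2, 5, 16)
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # torch.Size([2, 5, 16])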
Stars: 11
Forks: 6
Language: Python
License: MIT
Category:
Last pushed: Aug 06, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/naokishibuya/simple_transformer"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
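The same data can be fetched programmatically. The Python sketch below mirrors the curl example above using the requests library; it assumes keyless access within the free limit, and since the response schema is not documented here it simply prints the parsed JSON rather than naming specific fields.

import requests

# Same endpoint as the curl example above.
URL = ("https://pt-edge.onrender.com/api/v1/quality/"
       "transformers/naokishibuya/simple_transformer")

resp = requests.get(URL, timeout=10)
resp.raise_for_status()
data = resp.json()  # field names depend on the API's response schema
print(data)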
Higher-rated alternatives
lucidrains/x-transformers
A concise but complete full-attention transformer with a set of promising experimental features...
kanishkamisra/minicons
Utility for behavioral and representational analyses of Language Models
lucidrains/simple-hierarchical-transformer
Experiments around a simple idea for inducing multiple hierarchical predictive models within a GPT
lucidrains/dreamer4
Implementation of Danijar's latest iteration for his Dreamer line of work
Nicolepcx/Transformers-in-Action
This is the corresponding code for the book Transformers in Action