naokishibuya/simple_transformer

A Transformer Implementation that is easy to understand and customizable.

Score: 39 / 100 (Emerging)

This project offers a foundational implementation of the Transformer architecture, a key component in advanced language models. It takes raw text data as input, processes it, and outputs trained models for tasks like machine translation. This tool is for machine learning engineers, AI researchers, and students who are building or deeply studying natural language processing systems.
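At the heart of the Transformer architecture described above is scaled dot-product attention. The following is a minimal, dependency-free sketch of that operation for study purposes; it is not this repository's actual code, and the function names are illustrative.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V,
    with Q, K, V given as lists of row vectors."""
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # Weighted average of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

A query aligned with the first key attends mostly to the first value row, which is the mechanism the full architecture stacks and parallelizes across heads.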

No commits in the last 6 months.

Use this if you need to understand, customize, or build upon the core Transformer architecture for text-based AI applications.

Not ideal if you're looking for a plug-and-play solution for general text generation or analysis without needing to delve into the model's internals.

Tags: natural-language-processing, machine-translation, AI-research, deep-learning-engineering, text-modeling
Status: Stale (6m) · No Package · No Dependents

Maintenance: 2 / 25
Adoption: 5 / 25
Maturity: 16 / 25
Community: 16 / 25

Stars: 11
Forks: 6
Language: Python
License: MIT
Last pushed: Aug 06, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/naokishibuya/simple_transformer"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
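The same request can be made from Python using only the standard library. The endpoint path is taken from the curl command above; the shape of the JSON response is not documented here, so the fetch helper simply returns the decoded body.

```python
import json
import urllib.request

# Base endpoint, taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-score API URL for a GitHub owner/repo pair."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """GET the quality data and decode the JSON body.
    Subject to the rate limits described above."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)
```

For example, `fetch_quality("naokishibuya", "simple_transformer")` requests the same URL as the curl command.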