gmontamat/poor-mans-transformers
Implement Transformers (and Deep Learning) from scratch in NumPy
This project offers a hands-on way to understand how Transformers are built from the ground up. You implement the core components yourself and train models for tasks like duplicate-question detection and named entity recognition, using only NumPy. It is aimed at deep learning students, researchers, and educators who want to study the foundational mechanics of neural networks without relying on high-level frameworks.
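To give a flavor of what "from scratch in NumPy" means, here is a minimal sketch of scaled dot-product attention, the core Transformer operation. This is an illustrative example, not code taken from the repository; the function names and shapes are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max before exponentiating for numerical stability
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(q, k, v):
    # q, k, v: (seq_len, d_model) arrays
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)      # (seq_len, seq_len) similarities
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ v                   # weighted sum of the value vectors

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))          # toy sequence: 4 tokens, 8 dims
out = scaled_dot_product_attention(x, x, x)
print(out.shape)                          # (4, 8)
```

A full implementation would add learned projection matrices, multiple heads, and a backward pass, which is exactly the kind of work this project walks through.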
No commits in the last 6 months.
Use this if you are a deep learning student or educator who wants to learn the inner workings of Transformer models by implementing them from scratch using only NumPy.
Not ideal if you need to quickly build and deploy high-performance deep learning models for production, as this is a learning tool focused on clarity over efficiency.
- Stars: 28
- Forks: 2
- Language: Python
- License: MIT
- Category:
- Last pushed: Oct 03, 2023
- Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/gmontamat/poor-mans-transformers"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
lucidrains/x-transformers
A concise but complete full-attention transformer with a set of promising experimental features...
kanishkamisra/minicons
Utility for behavioral and representational analyses of Language Models
lucidrains/simple-hierarchical-transformer
Experiments around a simple idea for inducing multiple hierarchical predictive models within a GPT
lucidrains/dreamer4
Implementation of Danijar's latest iteration for his Dreamer line of work
Nicolepcx/Transformers-in-Action
This is the corresponding code for the book Transformers in Action