antonyvigouret/Pay-Attention-to-MLPs
My implementation of the gMLP model from the paper "Pay Attention to MLPs".
This project gives machine learning practitioners an implementation of the gMLP architecture for tasks like natural language processing and image recognition without the computational overhead of self-attention. Instead of attention, each block mixes information across tokens with a Spatial Gating Unit, which the paper reports can be competitive with Transformers of similar size on vision and language benchmarks. It is aimed at researchers and ML engineers looking for efficient, strong-performing alternatives to attention-based models.
No commits in the last 6 months.
Use this if you need to train powerful deep learning models for language or vision tasks but want to avoid the complexity and computational cost of Transformer architectures.
Not ideal if your primary goal is interpretability of attention weights or if you are already heavily invested in a Transformer-based ecosystem.
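For orientation, here is a minimal PyTorch sketch of the gMLP block described in the paper: a channel projection, a Spatial Gating Unit that mixes information across the token dimension with a learned projection, and a second channel projection with a residual connection. This illustrates the technique only; it is not code from this repository, and the class names and hyperparameters are assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatialGatingUnit(nn.Module):
    # Splits the channels in two and gates one half with a learned
    # projection across the sequence (token) dimension.
    def __init__(self, d_ffn, seq_len):
        super().__init__()
        self.norm = nn.LayerNorm(d_ffn // 2)
        # A kernel-size-1 conv over the token axis acts as a (seq_len x seq_len)
        # spatial projection applied independently at each feature position.
        self.spatial_proj = nn.Conv1d(seq_len, seq_len, kernel_size=1)
        nn.init.zeros_(self.spatial_proj.weight)   # start near identity,
        nn.init.ones_(self.spatial_proj.bias)      # as suggested in the paper

    def forward(self, x):                # x: (batch, seq_len, d_ffn)
        u, v = x.chunk(2, dim=-1)        # split channels in half
        v = self.spatial_proj(self.norm(v))
        return u * v                     # element-wise gating

class GMLPBlock(nn.Module):
    # Channel projection -> GELU -> spatial gating -> channel projection, with residual.
    def __init__(self, d_model, d_ffn, seq_len):
        super().__init__()
        self.norm = nn.LayerNorm(d_model)
        self.proj_in = nn.Linear(d_model, d_ffn)
        self.sgu = SpatialGatingUnit(d_ffn, seq_len)
        self.proj_out = nn.Linear(d_ffn // 2, d_model)

    def forward(self, x):                # x: (batch, seq_len, d_model)
        residual = x
        x = F.gelu(self.proj_in(self.norm(x)))
        x = self.sgu(x)
        return residual + self.proj_out(x)

# Example: one block over a batch of 2 sequences, 64 tokens, 128 features.
block = GMLPBlock(d_model=128, d_ffn=512, seq_len=64)
out = block(torch.randn(2, 64, 128))     # out.shape == (2, 64, 128)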
Stars: 25
Forks: 4
Language: Python
License: MIT
Category:
Last pushed: May 25, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/antonyvigouret/Pay-Attention-to-MLPs"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
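The same endpoint can also be queried from Python. A minimal sketch using the requests library, assuming the endpoint returns JSON:

import requests

url = ("https://pt-edge.onrender.com/api/v1/quality/"
       "transformers/antonyvigouret/Pay-Attention-to-MLPs")
response = requests.get(url, timeout=10)   # no API key needed on the free tier
response.raise_for_status()
print(response.json())                     # repository quality metrics (assumed JSON shape)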
Higher-rated alternatives
lucidrains/x-transformers
A concise but complete full-attention transformer with a set of promising experimental features...
kanishkamisra/minicons
Utility for behavioral and representational analyses of Language Models
lucidrains/simple-hierarchical-transformer
Experiments around a simple idea for inducing multiple hierarchical predictive models within a GPT
lucidrains/dreamer4
Implementation of Danijar's latest iteration for his Dreamer line of work
Nicolepcx/Transformers-in-Action
This is the corresponding code for the book Transformers in Action