mcbal/deep-implicit-attention
Implementation of deep implicit attention in PyTorch
This project offers an experimental PyTorch implementation of 'deep implicit attention', connecting the transformer architecture to mean-field theory and statistical physics. It frames attention mechanisms as solving a set of self-consistent equations, much like systems in statistical physics, giving researchers in machine learning and theoretical neuroscience a way to probe how modern AI models process information (a minimal sketch of this fixed-point view follows below).
No commits in the last 6 months.
Use this if you are a machine learning researcher or theoretician interested in the underlying mathematical and physical principles of transformer attention mechanisms.
Not ideal if you are looking for a plug-and-play solution for building or training standard transformer models for practical applications.
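To make the self-consistency idea concrete, here is a rough, hypothetical sketch (not the repository's actual API) of an attention layer treated as a fixed-point problem: the output is a solution of x = f(x), found here by plain iteration. The function names, the vanilla softmax update, and the convergence scheme are all illustrative assumptions; the project itself derives mean-field (TAP-style) updates rather than this toy version.

import torch
import torch.nn.functional as F

def implicit_attention(inputs, w_qkv, n_iter=50, tol=1e-5):
    # Iterate x <- softmax(Q K^T / sqrt(d)) V + inputs until the update
    # stops changing, i.e. until x is a fixed point of the map.
    # inputs: (batch, seq, dim) source term; w_qkv: (dim, 3 * dim) projection.
    x = inputs
    d = inputs.shape[-1]
    for _ in range(n_iter):
        q, k, v = (x @ w_qkv).chunk(3, dim=-1)
        scores = q @ k.transpose(-2, -1) / d ** 0.5
        x_new = F.softmax(scores, dim=-1) @ v + inputs  # self-consistent step
        if (x_new - x).abs().max() < tol:  # converged to a fixed point
            return x_new
        x = x_new
    return x

# Toy usage: 2 sequences of 8 tokens with 16-dim features. Small random
# weights keep the update map contractive so the iteration converges.
inputs = torch.randn(2, 8, 16)
w_qkv = 0.05 * torch.randn(16, 3 * 16)
out = implicit_attention(inputs, w_qkv)
print(out.shape)  # torch.Size([2, 8, 16])

Implicit-layer approaches like this typically differentiate through the fixed point via the implicit function theorem rather than backpropagating through the unrolled loop.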
Stars: 65
Forks: 5
Language: Python
License: MIT
Category: transformers
Last pushed: Aug 02, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/mcbal/deep-implicit-attention"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
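If you prefer Python, here is a minimal sketch using the requests library; the response schema is not documented above, so this simply pretty-prints whatever JSON the endpoint returns.

import json
import requests

url = ("https://pt-edge.onrender.com/api/v1/quality/"
       "transformers/mcbal/deep-implicit-attention")
resp = requests.get(url, timeout=10)
resp.raise_for_status()  # surface rate-limit or server errors early
print(json.dumps(resp.json(), indent=2))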
Higher-rated alternatives
lucidrains/x-transformers
A concise but complete full-attention transformer with a set of promising experimental features...
kanishkamisra/minicons
Utility for behavioral and representational analyses of Language Models
lucidrains/simple-hierarchical-transformer
Experiments around a simple idea for inducing multiple hierarchical predictive models within a GPT
lucidrains/dreamer4
Implementation of Danijar's latest iteration for his Dreamer line of work
Nicolepcx/Transformers-in-Action
This is the corresponding code for the book Transformers in Action