Awni00/abstract_transformer
This is the project repo associated with the paper "Disentangling and Integrating Relational and Sensory Information in Transformer Architectures" by Awni Altabaa and John Lafferty.
This project offers a new way to build AI models that can better understand sequences of information, such as text, images, or data points. It extends the standard Transformer so the model processes both the features of individual objects in a sequence and the relationships between those objects. It is designed for machine learning researchers and practitioners who build advanced AI systems for tasks like language understanding or image recognition.
Use this if you are building Transformer-based AI models and need them to more explicitly capture and leverage the relationships between items in a sequence, in addition to their individual characteristics.
Not ideal if you are looking for an off-the-shelf application to solve a specific problem, as this is a foundational architectural enhancement rather than an end-user tool.
Stars
6
Forks
—
Language
Jupyter Notebook
License
MIT
Category
Last pushed
Jan 21, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Awni00/abstract_transformer"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
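A minimal Python sketch of calling the same endpoint from code, assuming only the URL shown in the curl command above; the response schema is not documented on this page, so the commented request and its fields are illustrative, not authoritative.

```python
def build_api_url(owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a given repo.

    The base path is copied verbatim from the curl example on this page.
    """
    return (
        "https://pt-edge.onrender.com/api/v1/quality/transformers/"
        f"{owner}/{repo}"
    )

# Illustrative only: fetching the data requires network access, and the
# response fields are an assumption (not documented on this page).
# import urllib.request, json
# with urllib.request.urlopen(build_api_url("Awni00", "abstract_transformer")) as resp:
#     data = json.load(resp)

print(build_api_url("Awni00", "abstract_transformer"))
```

The helper just mirrors the URL pattern from the curl command, so swapping in another `owner/repo` pair queries a different listing.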
Higher-rated alternatives
microsoft/LoRA
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
jadore801120/attention-is-all-you-need-pytorch
A PyTorch implementation of the Transformer model in "Attention is All You Need".
bhavnicksm/vanilla-transformer-jax
JAX/Flax implementation of 'Attention Is All You Need' by Vaswani et al....
AbdelStark/attnres
Rust implementation of Attention Residuals from MoonshotAI/Kimi
kyegomez/SparseAttention
PyTorch implementation of the sparse attention from the paper: "Generating Long Sequences with...