BobMcDear/attention-in-vision
PyTorch implementation of popular attention mechanisms in vision
This is a collection of essential building blocks for deep learning models that process images. It lets researchers and machine learning engineers easily incorporate and test various attention mechanisms, which help models focus on the most relevant parts of an image: you supply an image-processing model, and the project provides drop-in attention modules to improve its performance.
No commits in the last 6 months.
Use this if you are a machine learning researcher or engineer building or experimenting with image recognition, object detection, or other computer vision models and want to improve their accuracy or efficiency.
Not ideal if you are a practitioner looking for a ready-to-use application for image analysis without needing to customize neural network architectures.
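To illustrate the kind of module the repository collects, here is a minimal squeeze-and-excitation (SE) channel-attention block in PyTorch. This is an illustrative sketch, not the repository's actual API; the class name, constructor arguments, and `reduction` default are assumptions.

```python
import torch
import torch.nn as nn


class SE(nn.Module):
    """Squeeze-and-excitation channel attention (a common vision attention module).

    Illustrative sketch only; not taken from this repository's code.
    """

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Bottleneck MLP that maps channel statistics to per-channel gates
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        # Squeeze: global average pool to one descriptor per channel
        w = x.mean(dim=(2, 3))                # shape (b, c)
        # Excite: produce gates in (0, 1) and broadcast over spatial dims
        w = self.fc(w).view(b, c, 1, 1)
        # Recalibrate: scale each feature map by its gate
        return x * w


# Usage: drop the module after any convolutional block
x = torch.randn(2, 64, 8, 8)
out = SE(64)(x)
print(out.shape)  # torch.Size([2, 64, 8, 8]) — input shape is preserved
```

Because the module preserves the input's shape, it can be inserted between existing layers of a backbone without any other architectural change.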
Stars: 19
Forks: 2
Language: Python
License: —
Category:
Last pushed: Sep 08, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/BobMcDear/attention-in-vision"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
philipperemy/keras-attention
Keras Attention Layer (Luong and Bahdanau scores).
tatp22/linformer-pytorch
My take on a practical implementation of Linformer for Pytorch.
datalogue/keras-attention
Visualizing RNNs using the attention mechanism
ematvey/hierarchical-attention-networks
Document classification with Hierarchical Attention Networks in TensorFlow. WARNING: project is...
thushv89/attention_keras
Keras Layer implementation of Attention for Sequential models