philipperemy/keras-attention
Keras Attention Layer (Luong and Bahdanau scores).
A reusable attention layer for developers building neural network models with Keras and TensorFlow. It lets a model learn to focus on the most relevant parts of sequential input such as text or time series, which typically improves both predictive performance and interpretability. It is aimed at machine learning engineers and researchers building their own models, not at end users.
Use this if you are a machine learning engineer or researcher building a Keras model and want to incorporate attention mechanisms to improve performance on tasks like text classification, sequence processing, or machine translation.
Not ideal if you are looking for a pre-built, end-to-end solution for a specific problem like sentiment analysis or image recognition, as this is a low-level building block.
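The "Luong and Bahdanau scores" in the title refer to the two classic attention scoring functions: Luong's multiplicative (dot-product) score and Bahdanau's additive score. As a minimal illustration of the math only (this is not the repo's layer API; all names, shapes, and parameters below are assumptions for the sketch):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def luong_dot_score(query, keys):
    # Luong "dot" score: one dot product per encoder state.
    # query: (d,), keys: (T, d) -> scores: (T,)
    return keys @ query

def bahdanau_score(query, keys, W_q, W_k, v):
    # Bahdanau (additive) score: v . tanh(W_q q + W_k k_s).
    # query: (d,), keys: (T, d), W_q/W_k: (a, d), v: (a,) -> scores: (T,)
    return np.tanh(keys @ W_k.T + query @ W_q.T) @ v

# Toy example with random encoder states and a random decoder query.
rng = np.random.default_rng(0)
T, d, a = 4, 3, 5                      # time steps, hidden size, attention size
keys = rng.normal(size=(T, d))
query = rng.normal(size=(d,))

w_luong = softmax(luong_dot_score(query, keys))
W_q, W_k = rng.normal(size=(a, d)), rng.normal(size=(a, d))
v = rng.normal(size=(a,))
w_bahdanau = softmax(bahdanau_score(query, keys, W_q, W_k, v))

# Either weight vector sums to 1; the context is a weighted sum of states.
context = w_luong @ keys               # shape (d,)
```

In a real Keras model the layer would learn `W_q`, `W_k`, and `v` during training; here they are random only to show the shapes involved.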
Stars: 2,815
Forks: 659
Language: Python
License: Apache-2.0
Category: ml-frameworks
Last pushed: Mar 12, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/philipperemy/keras-attention"
Open to everyone: 100 requests/day with no key; a free key raises the limit to 1,000 requests/day.
Related frameworks
tatp22/linformer-pytorch
My take on a practical implementation of Linformer for Pytorch.
ematvey/hierarchical-attention-networks
Document classification with Hierarchical Attention Networks in TensorFlow. WARNING: project is...
datalogue/keras-attention
Visualizing RNNs using the attention mechanism
thushv89/attention_keras
Keras Layer implementation of Attention for Sequential models
soskek/attention_is_all_you_need
Transformer of "Attention Is All You Need" (Vaswani et al. 2017) by Chainer.