philipperemy/keras-attention

Keras Attention Layer (Luong and Bahdanau scores).

Score: 61 / 100 (Established)

This is a specialized component for developers building neural network models with Keras and TensorFlow. It lets you integrate attention mechanisms into your models, which help the network focus on the most relevant parts of the input. Input is typically sequential data such as text or time series, and the attention-weighted output helps the model make more informed predictions or classifications. It is used by machine learning engineers and researchers aiming to improve the performance and interpretability of their deep learning models.
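The repository's title names the two classic attention scoring functions, Luong (multiplicative) and Bahdanau (additive). Below is a minimal NumPy sketch of what those scores compute for a single query vector over a sequence of keys; the function names, shapes, and single-query framing are illustrative assumptions for exposition, not this library's actual API.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def luong_score(query, keys):
    # Luong (multiplicative) score: dot product of the query with each key.
    # query: (d,), keys: (T, d) -> scores: (T,)
    return keys @ query

def bahdanau_score(query, keys, W1, W2, v):
    # Bahdanau (additive) score: v^T tanh(W1 q + W2 k) for each key k.
    # W1, W2: (h, d) projection matrices, v: (h,) -> scores: (T,)
    return np.tanh(query @ W1.T + keys @ W2.T) @ v

rng = np.random.default_rng(0)
d, h, T = 4, 8, 5                      # feature dim, hidden dim, sequence length
query = rng.normal(size=d)
keys = rng.normal(size=(T, d))

# Luong: attention weights and the resulting context vector.
weights = softmax(luong_score(query, keys))
context = weights @ keys               # (d,) weighted sum of the keys

# Bahdanau: same pipeline with learned (here random) projections.
W1, W2 = rng.normal(size=(h, d)), rng.normal(size=(h, d))
v = rng.normal(size=h)
bweights = softmax(bahdanau_score(query, keys, W1, W2, v))
```

In either variant the scores are normalized with a softmax and used to take a weighted sum over the sequence, which is the "focus on the most relevant parts of the input" behavior described above.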


Use this if you are a machine learning engineer or researcher building a Keras model and want to incorporate attention mechanisms to improve performance on tasks like text classification, sequence processing, or machine translation.

Not ideal if you are looking for a pre-built, end-to-end solution for a specific problem like sentiment analysis or image recognition, as this is a low-level building block.

Tags: deep-learning-development, neural-network-architecture, natural-language-processing, sequence-modeling, model-optimization

No package published · No dependents

Maintenance: 10 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 25 / 25


Stars: 2,815
Forks: 659
Language: Python
License: Apache-2.0
Last pushed: Mar 12, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/philipperemy/keras-attention"

Open to everyone: 100 requests/day with no key needed, or 1,000/day with a free key.