thushv89/attention_keras
Keras Layer implementation of Attention for Sequential models
This project provides a specialized building block for advanced neural network models, aimed at sequence tasks such as language translation. It consumes the outputs of a recurrent neural network (RNN) encoder and weights them to produce a focused context vector, which can then improve the accuracy of sequence prediction. It is intended for machine learning engineers and researchers building and fine-tuning deep learning models for sequence-to-sequence problems.
444 stars. No commits in the last 6 months.
Use this if you are a machine learning engineer building a Keras-based recurrent neural network (RNN) model and need to incorporate a Bahdanau Attention mechanism to improve its performance on sequence tasks.
Not ideal if you are looking for a complete, out-of-the-box solution for a specific application or if you require attention mechanisms other than Bahdanau Attention.
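To illustrate the mechanism this layer implements, here is a minimal NumPy sketch of additive (Bahdanau) attention: each encoder output is scored against the current decoder state, the scores are normalized with a softmax, and the weights produce a context vector as a weighted sum. The function and variable names are illustrative, not the library's API; for the actual Keras layer, see the repository's README and `attention.py`.

```python
import numpy as np

def bahdanau_attention(decoder_state, encoder_outputs, W1, W2, v):
    """Additive (Bahdanau) attention, as a plain-NumPy sketch.

    decoder_state:   shape (units,)      -- current decoder hidden state
    encoder_outputs: shape (T, units)    -- one vector per input time step
    W1, W2:          shape (att, units)  -- learned projections
    v:               shape (att,)        -- learned scoring vector
    """
    # score_i = v^T tanh(W1 h_i + W2 s)  -- one scalar per time step
    scores = np.tanh(encoder_outputs @ W1.T + decoder_state @ W2.T) @ v  # (T,)
    # Softmax over time steps (subtract max for numerical stability)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Context vector: attention-weighted sum of encoder outputs
    context = weights @ encoder_outputs  # (units,)
    return context, weights

# Toy example with random parameters
rng = np.random.default_rng(0)
T, units, att = 5, 4, 3
enc = rng.standard_normal((T, units))
s = rng.standard_normal(units)
W1 = rng.standard_normal((att, units))
W2 = rng.standard_normal((att, units))
v = rng.standard_normal(att)

context, weights = bahdanau_attention(s, enc, W1, W2, v)
print(weights)  # attention weights over the 5 input steps, summing to 1
```

In a trained model, `W1`, `W2`, and `v` are learned parameters, and the context vector is concatenated with the decoder state at each output step.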
Stars: 444
Forks: 266
Language: Python
License: MIT
Category:
Last pushed: Mar 25, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/thushv89/attention_keras"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related frameworks
philipperemy/keras-attention
Keras Attention Layer (Luong and Bahdanau scores).
tatp22/linformer-pytorch
My take on a practical implementation of Linformer for Pytorch.
ematvey/hierarchical-attention-networks
Document classification with Hierarchical Attention Networks in TensorFlow. WARNING: project is...
datalogue/keras-attention
Visualizing RNNs using the attention mechanism
soskek/attention_is_all_you_need
Transformer of "Attention Is All You Need" (Vaswani et al. 2017) by Chainer.