thushv89/attention_keras

Keras Layer implementation of Attention for Sequential models

Score: 51 / 100 (Established)

This project offers a specialized component for building advanced neural network models, specifically for sequence tasks such as language translation. It takes the outputs of a recurrent neural network (RNN) encoder and decoder as input and computes an attention-weighted context over them, which can then be used to improve the accuracy of sequence prediction. It is aimed at machine learning engineers and researchers who are building and fine-tuning deep learning models for sequence-to-sequence problems.
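Concretely, the layer implements Bahdanau-style (additive) attention. In standard notation (generic symbols, not the repository's internal variable names), for encoder outputs h_i and previous decoder state s_{t-1}:

e_{t,i} = v_a^\top \tanh(W_a s_{t-1} + U_a h_i)
\alpha_{t,i} = \operatorname{softmax}_i(e_{t,i})
c_t = \sum_i \alpha_{t,i} h_i

The context vector c_t is the "more focused output" that is combined with the decoder state before prediction.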

444 stars. No commits in the last 6 months.

Use this if you are a machine learning engineer building a Keras-based recurrent neural network (RNN) model and need to incorporate a Bahdanau attention mechanism to improve its performance on sequence tasks (see the usage sketch below).

Not ideal if you are looking for a complete, out-of-the-box solution for a specific application or if you require attention mechanisms other than Bahdanau Attention.
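For orientation, a minimal sketch of wiring the layer into a Keras encoder-decoder model, assuming the repository is on the Python path. The AttentionLayer import path and call pattern (inputs [encoder_outputs, decoder_outputs], returning the context sequence plus attention weights) follow the project's examples but may differ across versions; vocabulary sizes and unit counts are placeholder values.

# Minimal encoder-decoder sketch using the repo's AttentionLayer.
# Assumption: the module path below matches the repository layout
# (layers/attention.py) and may vary by version or checkout.
from tensorflow.keras.layers import (Input, Embedding, GRU, Dense,
                                     Concatenate, TimeDistributed)
from tensorflow.keras.models import Model

from layers.attention import AttentionLayer  # assumption: repo module path

src_vocab, tgt_vocab, units = 5000, 5000, 128  # placeholder sizes

# Encoder: return the full output sequence so attention can attend over it.
enc_in = Input(shape=(None,), name="encoder_tokens")
enc_emb = Embedding(src_vocab, units)(enc_in)
enc_out, enc_state = GRU(units, return_sequences=True,
                         return_state=True)(enc_emb)

# Decoder: also returns its full output sequence.
dec_in = Input(shape=(None,), name="decoder_tokens")
dec_emb = Embedding(tgt_vocab, units)(dec_in)
dec_out, _ = GRU(units, return_sequences=True,
                 return_state=True)(dec_emb, initial_state=enc_state)

# Bahdanau attention over encoder outputs, queried by decoder outputs;
# returns the context sequence and the attention weights.
attn_out, attn_weights = AttentionLayer(name="attention_layer")(
    [enc_out, dec_out])

# Concatenate context with decoder outputs before the softmax projection.
dec_concat = Concatenate(axis=-1)([dec_out, attn_out])
logits = TimeDistributed(Dense(tgt_vocab, activation="softmax"))(dec_concat)

model = Model([enc_in, dec_in], logits)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()

The attention weights returned alongside the context can be kept for visualizing which source tokens the model attends to at each decoding step.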

natural-language-processing sequence-modeling deep-learning-architecture machine-translation neural-networks
Status: Stale (6m) · No package · No dependents
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 25 / 25


Stars: 444
Forks: 266
Language: Python
License: MIT
Last pushed: Mar 25, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/thushv89/attention_keras"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.