keras-attention and attention_keras

keras-attention: 51 (Established)
Maintenance 0/25 · Adoption 10/25 · Maturity 16/25 · Community 25/25
Stars: 750 · Forks: 245 · Commits (30d): 0 · Language: Python · License: AGPL-3.0
Flags: Stale 6m · No Package · No Dependents

attention_keras: 51 (Established)
Maintenance 0/25 · Adoption 10/25 · Maturity 16/25 · Community 25/25
Stars: 444 · Forks: 266 · Commits (30d): 0 · Language: Python · License: MIT
Flags: Stale 6m · No Package · No Dependents

About keras-attention

datalogue/keras-attention

Visualizing RNNs using the attention mechanism

This project helps developers understand how a neural network translates dates by visualizing its attention. You feed in dates in a variety of human-readable formats, and the tool shows which parts of the input the network focuses on as it produces each character of the standardized, machine-readable output. It is aimed at machine learning engineers and researchers experimenting with neural network architectures for sequence-to-sequence tasks.

neural-networks attention-mechanism sequence-to-sequence RNN-visualization machine-learning-research
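
As a rough illustration of the kind of visualization the project produces, the sketch below plots a character-level attention matrix for a date-normalization pair. The attention weights here are synthetic placeholders, and the date strings and plotting choices are assumptions for the example, not output from the repository's model.

```python
# Minimal sketch (not the datalogue/keras-attention API): plot an attention
# matrix mapping output characters (machine-readable date) to input characters
# (human-readable date). A trained encoder-decoder model would supply the
# weights; here they are random stand-ins.
import numpy as np
import matplotlib.pyplot as plt

human = list("Saturday 9 May 2018")   # human-readable input characters
machine = list("2018-05-09")          # machine-readable output characters

# Hypothetical attention matrix: one row per output char, one column per input char.
rng = np.random.default_rng(0)
attention = rng.random((len(machine), len(human)))
attention /= attention.sum(axis=1, keepdims=True)  # rows sum to 1, like softmax scores

fig, ax = plt.subplots(figsize=(8, 3))
ax.imshow(attention, cmap="Greys", aspect="auto")
ax.set_xticks(range(len(human)))
ax.set_xticklabels(human)
ax.set_yticks(range(len(machine)))
ax.set_yticklabels(machine)
ax.set_xlabel("input (human-readable date)")
ax.set_ylabel("output (machine-readable date)")
plt.tight_layout()
plt.show()
```

With real model weights in place of the random matrix, dark cells show which input characters the network attended to when emitting each output character.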

About attention_keras

thushv89/attention_keras

Keras Layer implementation of Attention for Sequential models

This project provides a Keras attention layer for building sequence-to-sequence models, such as machine translation systems. The layer takes the per-timestep outputs of a recurrent neural network (RNN) encoder and computes a focused context for each decoder step, which the model can use to improve the accuracy of sequence prediction. It is aimed at machine learning engineers and researchers building and fine-tuning deep learning models for sequence-to-sequence problems.

natural-language-processing sequence-modeling deep-learning-architecture machine-translation neural-networks
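
For context, here is a minimal sketch of where such an attention layer sits in an encoder-decoder model. It uses tf.keras's built-in layers.Attention (dot-product attention) rather than the project's own AttentionLayer class, and the vocabulary sizes and unit counts are arbitrary assumptions.

```python
# Minimal encoder-decoder sketch with attention over encoder GRU outputs.
# Uses only standard tf.keras layers, not the thushv89/attention_keras API.
import tensorflow as tf
from tensorflow.keras import layers

vocab_in, vocab_out, units = 100, 80, 64  # assumed sizes for the example

# Encoder: embeds input tokens and runs a GRU, keeping per-timestep outputs.
enc_inputs = layers.Input(shape=(None,), dtype="int32")
enc_emb = layers.Embedding(vocab_in, units)(enc_inputs)
enc_seq, enc_state = layers.GRU(units, return_sequences=True, return_state=True)(enc_emb)

# Decoder: GRU initialized with the encoder's final state.
dec_inputs = layers.Input(shape=(None,), dtype="int32")
dec_emb = layers.Embedding(vocab_out, units)(dec_inputs)
dec_seq = layers.GRU(units, return_sequences=True)(dec_emb, initial_state=enc_state)

# Attention: each decoder step attends over all encoder steps; the resulting
# context is concatenated with the decoder output before the final prediction.
context = layers.Attention()([dec_seq, enc_seq])
combined = layers.Concatenate()([dec_seq, context])
outputs = layers.Dense(vocab_out, activation="softmax")(combined)

model = tf.keras.Model([enc_inputs, dec_inputs], outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```

The repository itself implements Bahdanau-style (additive) attention as a reusable layer; the built-in dot-product layer above is only a stand-in to show how encoder outputs feed the attention step.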

Scores updated daily from GitHub, PyPI, and npm data.