keya-desai/Gated-Attention

Implementation of the paper "Not All Attention Is Needed: Gated Attention Network for Sequence Data (GA-Net)" [https://arxiv.org/abs/1912.00349]

Score: 29 / 100 (Experimental)

This project helps machine learning engineers and researchers improve the efficiency and interpretability of their sequence models. It takes raw sequence data and processes it with a "gated attention" mechanism, which identifies and focuses on only the most relevant parts of the sequence. The result is a sparser, computationally lighter attention distribution for tasks such as natural language processing or time-series analysis.
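To make the idea concrete, here is a minimal NumPy sketch of hard-gated attention: binary gates first decide which positions participate at all, and the soft attention weights are then renormalized over the kept positions only. This is an illustrative simplification, not the paper's exact method (GA-Net trains the gates with an auxiliary network and a Gumbel-softmax relaxation); the `gate_logits` input and the zero threshold here are assumptions for the sketch.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array.
    e = np.exp(x - x.max())
    return e / e.sum()

def gated_attention(scores, gate_logits):
    """Hard-gated attention sketch.

    scores: unnormalized attention scores, shape (seq_len,)
    gate_logits: per-position gate scores, shape (seq_len,)
                 (hypothetical input; GA-Net learns these with an
                 auxiliary network)
    """
    keep = gate_logits > 0  # illustrative hard threshold
    if not keep.any():
        keep = np.ones_like(keep)  # fall back to plain soft attention
    # Mask out gated positions, then renormalize over the rest.
    masked = np.where(keep, scores, -np.inf)
    return softmax(masked)

scores = np.array([2.0, 0.5, 1.0])
gate_logits = np.array([1.0, -1.0, 0.5])
weights = gated_attention(scores, gate_logits)
# Position 1 is gated out entirely; the other weights still sum to 1.
```

Gated-out positions receive exactly zero weight, unlike traditional soft attention where every token contributes at least a little.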

No commits in the last 6 months.

Use this if you are building sequence-based models and need a more efficient way for your model to focus on the most relevant parts of the input, rather than attending to everything.

Not ideal if your primary concern is traditional 'soft attention' where every input token should contribute to the attention output, even if minimally.

machine-learning-engineering natural-language-processing time-series-analysis deep-learning-research model-optimization
No License · Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 5 / 25
Maturity 8 / 25
Community 16 / 25


Stars: 13
Forks: 6
Language: Jupyter Notebook
License: none
Last pushed: Aug 20, 2020
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/keya-desai/Gated-Attention"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
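The same endpoint can be called from Python. This is a small sketch built from the curl example above; the `quality_url` helper and its `category` parameter are illustrative assumptions, with the path segments taken directly from the URL shown.

```python
import json
from urllib.request import urlopen

def quality_url(category, owner, repo):
    # Hypothetical helper that rebuilds the endpoint from the curl example.
    return f"https://pt-edge.onrender.com/api/v1/quality/{category}/{owner}/{repo}"

url = quality_url("nlp", "keya-desai", "Gated-Attention")
# data = json.load(urlopen(url))  # uncomment to fetch (requires network access)
print(url)
```

With an API key, you would add it as documented by the service; the unauthenticated call above stays within the 100 requests/day tier.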