HySonLab/HierAttention

Scalable Hierarchical Self-Attention with Learnable Hierarchy for Long-Range Interactions

Score: 20 / 100 (Experimental)

This project helps machine learning researchers who are developing models that process very long sequences, such as extensive text documents or biological sequences. It provides a method to efficiently capture relationships between distant parts of a sequence, sidestepping the quadratic cost of standard self-attention. Given long sequential data as input, it yields an attention model that is both more computationally efficient and more performant.
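
To make the efficiency claim concrete, here is a minimal, illustrative Python sketch of the general idea behind hierarchical attention. It is not this repository's implementation: HierAttention learns its hierarchy, whereas this sketch uses fixed-block average pooling purely for illustration, and all function and variable names are assumptions.

import torch
import torch.nn.functional as F

def full_attention(q, k, v):
    # Standard scaled dot-product attention: cost grows as O(n^2)
    # in the sequence length n.
    scores = q @ k.transpose(-2, -1) / q.size(-1) ** 0.5
    return F.softmax(scores, dim=-1) @ v

def pooled_attention(q, k, v, block=16):
    # Hypothetical coarsening step: average-pool keys/values over fixed
    # blocks so each query attends to n/block summaries, not n positions.
    n, d = k.shape[-2], k.shape[-1]
    assert n % block == 0, "sequence length must be divisible by block"
    k_c = k.reshape(*k.shape[:-2], n // block, block, d).mean(dim=-2)
    v_c = v.reshape(*v.shape[:-2], n // block, block, d).mean(dim=-2)
    scores = q @ k_c.transpose(-2, -1) / d ** 0.5
    return F.softmax(scores, dim=-1) @ v_c

q = k = v = torch.randn(1, 1024, 64)    # (batch, length, dim)
print(pooled_attention(q, k, v).shape)  # torch.Size([1, 1024, 64])

Attending to n/block pooled summaries instead of all n positions is what reduces the quadratic cost; the repository's contribution, per its title, is making that hierarchy learnable rather than fixed as it is here.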

No commits in the last 6 months.

Use this if you are a machine learning researcher working on models for very long sequences and need a more efficient way to capture long-range dependencies.

Not ideal if you are not a machine learning researcher or your primary interest is in applying existing, off-the-shelf models to short sequences.

deep-learning sequence-modeling attention-mechanisms computational-efficiency model-architecture
Flags: No License · Stale (6m) · No Package · No Dependents

Maintenance: 0 / 25
Adoption: 4 / 25
Maturity: 8 / 25
Community: 8 / 25

Stars: 8
Forks: 1
Language: Python
License: None
Last pushed: Apr 24, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/HySonLab/HierAttention"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
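
For scripted access, a minimal Python sketch that issues the same request as the curl command above; the response is printed verbatim because its exact schema is not documented on this page:

import json
import urllib.request

# Same endpoint as the curl example above.
url = "https://pt-edge.onrender.com/api/v1/quality/transformers/HySonLab/HierAttention"
with urllib.request.urlopen(url) as resp:
    data = json.load(resp)           # parse the JSON body
print(json.dumps(data, indent=2))    # schema undocumented here, so print as-is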