THUMNLab/AutoAttend

Code Implementation for AutoAttend: Automated Attention Representation Search

Score: 35 / 100 (Emerging)

This project helps machine learning researchers and practitioners automatically design optimized self-attention models for specific tasks. Given a dataset such as SST (for sentiment analysis) and pre-trained word embeddings, it searches for the best-performing self-attention architecture tailored to that data. It is aimed at natural language processing practitioners who want high performance without manual model design.

No commits in the last 6 months.

Use this if you are a machine learning researcher or engineer working with natural language processing and need to find the most effective self-attention model for a specific text-based task.

Not ideal if you are looking for a pre-built, off-the-shelf sentiment analysis tool or if your primary focus is not on developing or optimizing attention mechanisms.

Tags: Natural Language Processing, Sentiment Analysis, Machine Learning Research, Model Optimization, Deep Learning
Badges: Stale (6 months), No Package, No Dependents

Maintenance: 0 / 25
Adoption: 5 / 25
Maturity: 16 / 25
Community: 14 / 25
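The overall score of 35 / 100 appears to be the sum of the four category scores above; the formula is an assumption inferred from the numbers shown, but it checks out arithmetically:

```python
# Category scores as shown on the card (each out of 25).
scores = {"Maintenance": 0, "Adoption": 5, "Maturity": 16, "Community": 14}

# Assumed formula: overall score = sum of the four category scores.
overall = sum(scores.values())
print(overall)  # 35, matching the 35 / 100 shown above
```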


Stars: 11
Forks: 3
Language: Python
License: Apache-2.0
Last pushed: Jul 26, 2021
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/THUMNLab/AutoAttend"

Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.
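The same endpoint can be queried from a script; a minimal Python sketch using only the standard library (the response schema is not documented here, so the JSON is printed as-is, and the `quality_url` helper is just illustrative):

```python
import json
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the API URL for a repository's quality data."""
    return f"{BASE}/{category}/{owner}/{repo}"

url = quality_url("nlp", "THUMNLab", "AutoAttend")
print(url)

# Uncomment to fetch live data (subject to the 100 requests/day limit):
# with urlopen(url) as resp:
#     data = json.load(resp)
#     print(json.dumps(data, indent=2))
```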