THUMNLab/AutoAttend
Code Implementation for AutoAttend: Automated Attention Representation Search
This project helps machine learning researchers and practitioners automatically design optimized self-attention models for specific tasks. Given a dataset such as SST (for sentiment analysis) and pre-trained word embeddings, it searches for and outputs the best-performing self-attention architecture tailored to that data. It is aimed at natural language processing practitioners who want strong performance without manual model design.
No commits in the last 6 months.
Use this if you are a machine learning researcher or engineer working with natural language processing and need to find the most effective self-attention model for a specific text-based task.
Not ideal if you are looking for a pre-built, off-the-shelf sentiment analysis tool or if your primary focus is not on developing or optimizing attention mechanisms.
Stars
11
Forks
3
Language
Python
License
Apache-2.0
Category
NLP
Last pushed
Jul 26, 2021
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/THUMNLab/AutoAttend"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
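For programmatic access, the same endpoint can be reached from Python with the standard library. This is a minimal sketch: the URL pattern `/api/v1/quality/<category>/<owner>/<repo>` is inferred from the single curl example above, and the response format is not documented in this listing, so both are assumptions.

```python
from urllib.parse import quote

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    # Build the endpoint URL. The path pattern is inferred from the
    # one curl example above and may not cover every category.
    return f"{API_BASE}/{quote(category)}/{quote(owner)}/{quote(repo)}"

url = quality_url("nlp", "THUMNLab", "AutoAttend")
print(url)

# To actually fetch (requires network; no API key needed up to
# 100 requests/day, per the note above). The JSON shape is not
# specified in this listing, so inspect `data` before relying on it:
# import json, urllib.request
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
```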
Higher-rated alternatives
xv44586/toolkit4nlp
Transformers implementation (architecture, task examples, serving, and more)
luozhouyang/transformers-keras
Transformer-based models implemented in TensorFlow 2.x (using Keras).
ufal/neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
graykode/xlnet-Pytorch
Simple XLNet implementation with a PyTorch wrapper
uzaymacar/attention-mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language...