ematvey/hierarchical-attention-networks
Document classification with Hierarchical Attention Networks in TensorFlow. WARNING: the project is currently unmaintained; issues will probably not be addressed.
This project implements Hierarchical Attention Networks (Yang et al., 2016) for classifying large collections of text documents, such as customer reviews or articles, into predefined categories. You provide raw, labeled text data; the model learns word-level and sentence-level attention and outputs a class for each document, making large text datasets easier to analyze and manage. It's aimed at data analysts, market researchers, or anyone needing to sort extensive text datasets efficiently.
467 stars. No commits in the last 6 months.
Use this if you need to automatically classify large volumes of text documents into specific categories to streamline analysis or organization.
Not ideal if you need state-of-the-art accuracy or ongoing support, or if you aren't comfortable with some manual setup, since the project is unmaintained.
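To illustrate the core idea behind the architecture, here is a minimal NumPy sketch of the attention pooling step that Hierarchical Attention Networks apply at each level (Yang et al., 2016): a learned context vector scores each hidden state, and a softmax-weighted sum pools the states into one vector. All weights below are random stand-ins, not the project's trained parameters; in the actual repository this is learned in TensorFlow and applied twice, first over word states to build sentence vectors, then over sentence states to build a document vector.

```python
import numpy as np

def attention_pool(h, W, b, u_ctx):
    """Attention-pool hidden states.

    h:     (steps, dim) hidden states (e.g. word encoder outputs)
    W, b:  projection weights for the one-layer MLP scorer
    u_ctx: (dim,) learned context vector
    Returns the pooled (dim,) vector and the (steps,) attention weights.
    """
    u = np.tanh(h @ W + b)             # project each hidden state
    scores = u @ u_ctx                 # similarity with the context vector
    alpha = np.exp(scores - scores.max())
    alpha = alpha / alpha.sum()        # softmax attention weights
    return alpha @ h, alpha

# Toy usage with random data: 5 word states of dimension 8.
rng = np.random.default_rng(0)
h = rng.normal(size=(5, 8))
W = rng.normal(size=(8, 8))
b = np.zeros(8)
u_ctx = rng.normal(size=8)
pooled, alpha = attention_pool(h, W, b, u_ctx)
```

Running the same pooling over each sentence's word states, and then over the resulting sentence vectors, yields the document representation that the final classifier consumes.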
Stars
467
Forks
146
Language
Python
License
MIT
Category
ML frameworks
Last pushed
May 03, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/ematvey/hierarchical-attention-networks"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
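A hedged sketch of calling the API from Python rather than curl. The endpoint path comes straight from the command above; the shape of the JSON response is not documented on this page, so the example simply pretty-prints whatever comes back.

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the per-repository endpoint URL."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch the JSON payload (no key needed: 100 requests/day)."""
    url = quality_url(category, owner, repo)
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

if __name__ == "__main__":
    data = fetch_quality("ml-frameworks", "ematvey",
                         "hierarchical-attention-networks")
    print(json.dumps(data, indent=2))
```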
Related frameworks
philipperemy/keras-attention
Keras Attention Layer (Luong and Bahdanau scores).
tatp22/linformer-pytorch
My take on a practical implementation of Linformer for Pytorch.
datalogue/keras-attention
Visualizing RNNs using the attention mechanism
thushv89/attention_keras
Keras Layer implementation of Attention for Sequential models
soskek/attention_is_all_you_need
Transformer of "Attention Is All You Need" (Vaswani et al. 2017) by Chainer.