roomylee/self-attentive-emb-tf
Simple TensorFlow implementation of "A Structured Self-attentive Sentence Embedding" (ICLR 2017)
This project helps developers and researchers working with natural language understand and categorize text. It takes raw text sentences as input and produces a concise numerical representation (embedding) that captures the meaning, along with a visualization showing which words the model paid most attention to. This is ideal for those building systems that need to process and classify large volumes of text, such as news articles or social media posts.
No commits in the last 6 months.
Use this if you are a machine learning engineer or researcher who needs to transform text into meaningful numerical data for tasks like topic classification, and you want to understand which parts of a sentence contribute most to its meaning.
Not ideal if you are looking for a pre-built, production-ready application for immediate text analysis without diving into model training and evaluation.
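The embedding described above follows the paper's structured self-attention: the model computes an attention matrix A = softmax(W_s2 · tanh(W_s1 · H^T)) over the BiLSTM hidden states H, then forms the sentence embedding M = A · H. The rows of A are what the visualization highlights. A minimal NumPy sketch of this math (dimension names follow the paper's notation; the random values stand in for learned weights and real hidden states):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical sizes: n tokens, 2u BiLSTM units, d_a attention
# hidden size, r attention hops (notation from the paper).
n, two_u, d_a, r = 6, 8, 10, 3
rng = np.random.default_rng(0)

H = rng.standard_normal((n, two_u))       # BiLSTM hidden states, one row per token
W_s1 = rng.standard_normal((d_a, two_u))  # first attention weight matrix
W_s2 = rng.standard_normal((r, d_a))      # second attention weight matrix

# A = softmax(W_s2 @ tanh(W_s1 @ H^T)): r attention hops over n tokens
A = softmax(W_s2 @ np.tanh(W_s1 @ H.T), axis=-1)  # shape (r, n)

# Sentence embedding M = A @ H: an r x 2u matrix summarizing the sentence
M = A @ H

print(A.shape, M.shape)  # (3, 6) (3, 8)
```

Each row of A sums to 1, so every attention hop is a weighted average over the tokens; plotting those rows over the input words gives the attention visualization the project produces.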
Stars
90
Forks
32
Language
Python
License
MIT
Category
Last pushed
Jun 25, 2018
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/roomylee/self-attentive-emb-tf"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
xv44586/toolkit4nlp
Transformer implementations (architecture, task examples, serving, and more)
luozhouyang/transformers-keras
Transformer-based models implemented in TensorFlow 2.x (using Keras).
ufal/neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
graykode/xlnet-Pytorch
Simple XLNet implementation with a PyTorch wrapper
uzaymacar/attention-mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language...