roomylee/self-attentive-emb-tf

Simple TensorFlow implementation of "A Structured Self-attentive Sentence Embedding" (ICLR 2017)

Score: 45 / 100 (Emerging)

This project helps developers and researchers working with natural language understand and categorize text. It takes raw text sentences as input and produces a concise numerical representation (embedding) that captures the meaning, along with a visualization showing which words the model paid most attention to. This is ideal for those building systems that need to process and classify large volumes of text, such as news articles or social media posts.

No commits in the last 6 months.

Use this if you are a machine learning engineer or researcher who needs to transform text into meaningful numerical data for tasks like topic classification, and you want to understand which parts of a sentence contribute most to its meaning.

Not ideal if you are looking for a pre-built, production-ready application for immediate text analysis without diving into model training and evaluation.
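To make the idea concrete, here is a minimal NumPy sketch of the structured self-attention mechanism the paper (and this repo) is built on: the LSTM hidden states H are combined into r weighted sums, one per attention "hop", and the attention matrix A shows which words each hop focused on. This is an illustrative sketch, not the repo's TensorFlow code; the variable names (H, W_s1, W_s2, d_a, r) follow the paper's notation, and the random inputs are placeholders.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def structured_self_attention(H, W_s1, W_s2):
    """Compute the sentence embedding M and attention matrix A.

    H:    (n, d)   hidden states, one row per word
    W_s1: (d_a, d) first projection
    W_s2: (r, d_a) one row per attention hop
    """
    A = softmax(W_s2 @ np.tanh(W_s1 @ H.T), axis=-1)  # (r, n): each row sums to 1
    M = A @ H                                         # (r, d): r weighted sums of word states
    return M, A

# Toy dimensions: 6 words, 8-dim states, d_a=4, r=2 hops.
rng = np.random.default_rng(0)
n, d, d_a, r = 6, 8, 4, 2
H = rng.standard_normal((n, d))
W_s1 = rng.standard_normal((d_a, d))
W_s2 = rng.standard_normal((r, d_a))
M, A = structured_self_attention(H, W_s1, W_s2)
print(M.shape, A.shape)
```

Visualizing the rows of A over the input words is exactly the attention heatmap the project produces alongside the embedding.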

Tags: Natural Language Processing · Text Classification · Machine Learning Research · Sentence Representation · Neural Networks

Stale (6m) · No Package · No Dependents

Maintenance: 0 / 25
Adoption: 9 / 25
Maturity: 16 / 25
Community: 20 / 25


Stars: 90
Forks: 32
Language: Python
License: MIT
Last pushed: Jun 25, 2018
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/roomylee/self-attentive-emb-tf"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.