tony-hong/event-embedding-multitask
*SEM 2018: Learning Distributed Event Representations with a Multi-Task Approach
This project produces distributed numerical representations (embeddings) of events described in raw text, capturing their semantic properties. A computational linguist or NLP scientist would use these embeddings to build better systems for downstream tasks such as event prediction or sentiment analysis.
No commits in the last 6 months.
Use this if you are developing NLP models and need to represent events from textual data in a meaningful numerical format for downstream tasks.
Not ideal if you need a pre-trained, ready-to-use solution for general text summarization or question answering.
Stars: 21
Forks: 12
Language: Python
License: GPL-3.0
Category:
Last pushed: Oct 30, 2018
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/embeddings/tony-hong/event-embedding-multitask"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.
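The endpoint path embeds the repository owner and name. A minimal sketch of building the URL for an arbitrary repository, assuming the path pattern shown above generalizes (only the URL for this repository is confirmed on this page):

```python
# Assumption: /api/v1/quality/embeddings/{owner}/{repo} is the general
# URL pattern; only the tony-hong/event-embedding-multitask URL is
# confirmed by this page.
BASE = "https://pt-edge.onrender.com/api/v1/quality/embeddings"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-API URL for a given GitHub owner/repo pair."""
    return f"{BASE}/{owner}/{repo}"

print(quality_url("tony-hong", "event-embedding-multitask"))
# → https://pt-edge.onrender.com/api/v1/quality/embeddings/tony-hong/event-embedding-multitask
```

Pass the resulting URL to curl or any HTTP client; unauthenticated requests count against the 100/day limit.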
Higher-rated alternatives
MilaNLProc/contextualized-topic-models
A Python package for contextualized topic modeling. CTMs combine contextualized embeddings...
vinid/cade
Compass-aligned Distributional Embeddings. Align embeddings from different corpora
spcl/ncc
Neural Code Comprehension: A Learnable Representation of Code Semantics
criteo-research/CausE
Code for the RecSys 2018 paper "Causal Embeddings for Recommendation".
vintasoftware/entity-embed
PyTorch library for transforming entities like companies, products, etc. into vectors to support...