Riccorl/sense-embedding
BabelNet (and WordNet) sense embedding trained with Word2Vec and FastText
This tool generates "sense embeddings": numerical representations for each distinct meaning (sense) of a word, rather than one vector per word form. It trains on large text datasets, such as multilingual corpora, against the BabelNet (and WordNet) sense inventories. Language researchers, computational linguists, and NLP developers can use it to differentiate between homonyms and polysemous words.
No commits in the last 6 months.
Use this if you need to capture the nuanced meanings of words in text, especially for tasks where distinguishing between different senses of the same word is crucial for accurate analysis.
Not ideal if you only need general word-level representations and are not concerned with the specific, disambiguated meanings of words.
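The repo's own training code isn't shown here, but the usual pipeline behind Word2Vec/FastText sense embeddings is simple: disambiguate each word occurrence to a sense ID, rewrite the token as `lemma_senseID`, and train a standard embedding model on the tagged corpus so each sense gets its own vector. A minimal sketch of the tagging step (the `tag_senses` helper and the sense IDs are illustrative, not the repo's actual API):

```python
def tag_senses(tokens, senses):
    """Rewrite tokens as 'token_senseID' so a standard Word2Vec or
    FastText model learns one vector per word sense.

    senses[i] is the sense ID for tokens[i] (e.g. a BabelNet synset ID
    produced by a disambiguator), or None when no sense is assigned.
    """
    return [f"{t}_{s}" if s else t for t, s in zip(tokens, senses)]


# Toy example: 'bank' disambiguated to a financial-institution sense.
# The ID format mimics BabelNet synset IDs; the value here is illustrative.
sentence = ["the", "bank", "approved", "the", "loan"]
senses = [None, "bn:00008364n", None, None, None]
tagged = tag_senses(sentence, senses)
print(tagged)  # ['the', 'bank_bn:00008364n', 'approved', 'the', 'loan']
```

Feeding such tagged sentences to an off-the-shelf trainer (e.g. gensim's `Word2Vec` or `FastText`) then yields distinct vectors for `bank_bn:00008364n` and any other sense of "bank" seen in the corpus.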
Stars
10
Forks
1
Language
Python
License
—
Category
Last pushed
Sep 03, 2019
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/embeddings/Riccorl/sense-embedding"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
MilaNLProc/contextualized-topic-models
A Python package to run contextualized topic modeling. CTMs combine contextualized embeddings...
vinid/cade
Compass-aligned Distributional Embeddings. Align embeddings from different corpora
spcl/ncc
Neural Code Comprehension: A Learnable Representation of Code Semantics
criteo-research/CausE
Code for the RecSys 2018 paper entitled Causal Embeddings for Recommendation.
vintasoftware/entity-embed
PyTorch library for transforming entities like companies, products, etc. into vectors to support...