MiuLab/GenDef
Probing task; contextual embeddings -> textual definitions (EMNLP19)
GenDef is a probing tool that maps contextual word embeddings to human-readable definitions: given a word's vector representation from a model such as BERT or ELMo, together with its sentence context, it generates a textual definition of what the model "thinks" the word means in that context. It is aimed at computational linguists and AI researchers studying model interpretability.
No commits in the last 6 months.
Use this if you need to interpret the meaning captured by contextualized word embeddings from models like BERT or ELMo.
Not ideal if you are looking for a general-purpose dictionary or a tool to define words in everyday language for a broad audience.
Stars: 11
Forks: 3
Language: Python
License: —
Category:
Last pushed: Apr 22, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/embeddings/MiuLab/GenDef"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
MilaNLProc/contextualized-topic-models
A python package to run contextualized topic modeling. CTMs combine contextualized embeddings...
vinid/cade
Compass-aligned Distributional Embeddings. Align embeddings from different corpora
spcl/ncc
Neural Code Comprehension: A Learnable Representation of Code Semantics
criteo-research/CausE
Code for the RecSys 2018 paper "Causal Embeddings for Recommendation."
vintasoftware/entity-embed
PyTorch library for transforming entities like companies, products, etc. into vectors to support...