kaushalshetty/Positional-Encoding
Encoding position with the word embeddings.
This tool helps AI model developers enhance how their models understand the order of words in a sentence, especially for tasks like language translation or text summarization. It takes in sequences of words and their initial numerical representations (word embeddings) and outputs enhanced representations that incorporate each word's position, ready for processing by attention layers. This is ideal for machine learning engineers and researchers building transformer-based language models.
No commits in the last 6 months.
Use this if you are developing natural language processing models and want to leverage positional encoding to capture word order without using recurrent neural networks.
Not ideal if you are looking for a high-level, off-the-shelf NLP solution or are not comfortable working with deep learning model components at the code level.
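The repository implements the positional-encoding idea from the Transformer paper ("Attention Is All You Need"): a fixed sinusoidal matrix is added to the word embeddings so attention layers can use word order. A minimal NumPy sketch of that technique (not the repo's exact notebook code; the function name and toy shapes are illustrative):

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Fixed sinusoidal position matrix, as in 'Attention Is All You Need'."""
    positions = np.arange(seq_len)[:, np.newaxis]        # shape (seq_len, 1)
    dims = np.arange(d_model)[np.newaxis, :]             # shape (1, d_model)
    # Each pair of dimensions shares a wavelength from 2*pi up to 10000*2*pi.
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / d_model)
    angles = positions * angle_rates                     # shape (seq_len, d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])                # even dimensions: sine
    pe[:, 1::2] = np.cos(angles[:, 1::2])                # odd dimensions: cosine
    return pe

# Hypothetical word embeddings: 10 tokens, 16-dimensional.
embeddings = np.random.randn(10, 16)
# The encoding is simply added element-wise to the embeddings.
encoded = embeddings + sinusoidal_positional_encoding(10, 16)
```

Because the matrix depends only on position and dimension, it is computed once and reused for any input sequence of the same length, with no trainable parameters.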
Stars
84
Forks
13
Language
Jupyter Notebook
License
—
Category
—
Last pushed
May 17, 2018
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/embeddings/kaushalshetty/Positional-Encoding"
Open to everyone: 100 requests/day with no API key. A free key raises the limit to 1,000 requests/day.
Higher-rated alternatives
MilaNLProc/contextualized-topic-models
A python package to run contextualized topic modeling. CTMs combine contextualized embeddings...
vinid/cade
Compass-aligned Distributional Embeddings. Align embeddings from different corpora
spcl/ncc
Neural Code Comprehension: A Learnable Representation of Code Semantics
criteo-research/CausE
Code for the RecSys 2018 paper entitled Causal Embeddings for Recommendation.
vintasoftware/entity-embed
PyTorch library for transforming entities like companies, products, etc. into vectors to support...