PrashantRanjan09/WordEmbeddings-Elmo-Fasttext-Word2Vec
Using pre-trained word embeddings (FastText, Word2Vec)
This project helps data scientists, machine learning engineers, and NLP practitioners prepare text data for analysis. It takes a corpus of text as input and outputs numerical representations of words (word embeddings) using various state-of-the-art models. These embeddings can then be fed into downstream machine learning tasks such as sentiment analysis or text classification.
158 stars. No commits in the last 6 months.
Use this if you need to convert textual data into a numerical format suitable for machine learning models, and you want flexibility in choosing different word embedding techniques.
Not ideal if you're looking for a complete end-to-end text classification application rather than a tool for generating word embeddings.
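The corpus-in, vectors-out workflow described above can be sketched without the repo's model dependencies. The repo itself wraps Word2Vec, FastText, and ELMo; the snippet below is a hedged, dependency-light illustration of the same idea using a classic count-based alternative (co-occurrence counts plus truncated SVD), not the repo's actual code path.

```python
# Illustrative only: a tiny count-based word-embedding pipeline in numpy.
# It shows the corpus -> numerical-vectors transformation the repo performs,
# but with co-occurrence + SVD instead of Word2Vec/FastText/ELMo.
import numpy as np

def build_embeddings(corpus, window=2, dim=4):
    """Return {word: vector} from a list of tokenized sentences."""
    vocab = sorted({w for sent in corpus for w in sent})
    index = {w: i for i, w in enumerate(vocab)}
    # Count co-occurrences within a symmetric context window.
    counts = np.zeros((len(vocab), len(vocab)))
    for sent in corpus:
        for i, w in enumerate(sent):
            lo, hi = max(0, i - window), min(len(sent), i + window + 1)
            for j in range(lo, hi):
                if i != j:
                    counts[index[w], index[sent[j]]] += 1.0
    # Truncated SVD compresses the counts into dense low-dim vectors.
    u, s, _ = np.linalg.svd(counts, full_matrices=False)
    dim = min(dim, len(vocab))
    vectors = u[:, :dim] * s[:dim]
    return {w: vectors[index[w]] for w in vocab}

corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
]
emb = build_embeddings(corpus)
print(len(emb), emb["cat"].shape)  # one 4-d vector per vocabulary word
```

Swapping this for a trained Word2Vec or FastText model changes the quality of the vectors, not the shape of the pipeline: tokenized text goes in, a word-to-vector mapping comes out.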
Stars
158
Forks
31
Language
Python
License
—
Category
Last pushed
Jun 19, 2018
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/embeddings/PrashantRanjan09/WordEmbeddings-Elmo-Fasttext-Word2Vec"
Open to everyone: 100 requests/day with no key required. Get a free key for 1,000 requests/day.
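The same endpoint can be called from Python instead of curl. Only the URL comes from this page; the response schema is not documented here, so the snippet below just builds the request URL and leaves the actual fetch commented out as an assumption about a JSON response.

```python
# Hedged sketch: query the quality API shown above from Python.
# The endpoint URL is from this page; the JSON response shape is assumed.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/embeddings"

def quality_url(owner, repo):
    """Build the per-repo quality endpoint URL."""
    return f"{BASE}/{owner}/{repo}"

url = quality_url("PrashantRanjan09", "WordEmbeddings-Elmo-Fasttext-Word2Vec")
print(url)

# Uncomment to perform the request (requires network access; no key needed
# for up to 100 requests/day):
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
#     print(data)
```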
Higher-rated alternatives
CyberZHG/keras-pos-embd
Position embedding layers in Keras
mb-14/embeddings.js
Word embeddings for the web
anishLearnsToCode/word-embeddings
Continuous Bag 💼 of Words Model to create Word embeddings for a word from a given Corpus 📚 and...
neuro-symbolic-ai/multi_relational_hyperbolic_word_embeddings
Multi-Relational Hyperbolic Word Embeddings from Natural Language Definitions
SpringerNLP/Chapter5
Chapter 5: Embeddings