BotCenter/spanishWordEmbeddings
Spanish word embeddings computed with fastText from large corpora, available in several sizes.
This project provides pre-computed Spanish word embeddings: numerical representations of Spanish words that capture their meaning and relationships. Given individual Spanish words, the embeddings let you measure semantic similarity between them. This resource is valuable for anyone working with Spanish text who needs to power applications like sentiment analysis, topic modeling, or translation.
No commits in the last 6 months.
Use this if you need to understand the meaning and context of Spanish words programmatically for natural language processing tasks.
Not ideal if your project requires word embeddings for a language other than Spanish or needs to process non-textual data.
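To illustrate the kind of semantic-similarity check these embeddings enable, here is a minimal sketch. The tiny vectors below are made-up stand-ins: in practice you would load the project's pre-trained fastText `.vec` files (for example with gensim's `KeyedVectors`) and look up real 300-dimensional vectors.

```python
import math

# Hypothetical toy vectors standing in for real fastText embeddings.
# Real vectors from this project would be loaded from its .vec files.
toy_embeddings = {
    "rey":   [0.90, 0.10, 0.30],   # "king"
    "reina": [0.85, 0.15, 0.35],   # "queen"
    "mesa":  [0.10, 0.90, 0.20],   # "table"
}

def cosine_similarity(a, b):
    """Cosine similarity between two word vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

sim_related = cosine_similarity(toy_embeddings["rey"], toy_embeddings["reina"])
sim_unrelated = cosine_similarity(toy_embeddings["rey"], toy_embeddings["mesa"])
print(sim_related > sim_unrelated)  # semantically related words score higher
```

With real embeddings the same `cosine_similarity` call is how you would rank candidate words by semantic closeness.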
Stars
9
Forks
1
Language
—
License
MIT
Category
—
Last pushed
Jul 19, 2019
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/embeddings/BotCenter/spanishWordEmbeddings"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
shibing624/text2vec
text2vec, text to vector....
predict-idlab/pyRDF2Vec
Python Implementation and Extension of RDF2Vec
IntuitionEngineeringTeam/chars2vec
Character-based word embeddings model based on RNN for handling real world texts
IITH-Compilers/IR2Vec
Implementation of IR2Vec, LLVM IR Based Scalable Program Embeddings
ddangelov/Top2Vec
Top2Vec learns jointly embedded topic, document and word vectors.