aitoralmeida/spanish_word2vec
Ready to use Spanish Word2Vec embeddings created from >18B chars and >3B words
This project offers pre-trained word embeddings for the Spanish language, so you can capture relationships between Spanish words without training a model yourself. Given a Spanish word, it provides a dense numerical vector that encodes the word's semantic context, which can then be compared against other word vectors. This is useful for researchers, data scientists, and language model developers working with Spanish text.
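To illustrate how such embeddings are typically used, the sketch below compares toy word vectors with cosine similarity. The real vectors come from the repository's published model files (and are much higher-dimensional); the 4-dimensional values here are invented stand-ins for illustration only.

```python
import math

# Toy 4-dimensional vectors standing in for real Spanish Word2Vec
# embeddings. The actual vectors ship with the repository's model
# files; these values are illustrative placeholders.
embeddings = {
    "rey":   [0.9, 0.1, 0.4, 0.0],
    "reina": [0.8, 0.2, 0.5, 0.1],
    "perro": [0.1, 0.9, 0.0, 0.7],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: closer to 1.0 means
    the vectors point in more similar directions."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Semantically related words should score higher than unrelated ones.
print(cosine_similarity(embeddings["rey"], embeddings["reina"]))
print(cosine_similarity(embeddings["rey"], embeddings["perro"]))
```

With real pre-trained vectors, the same comparison lets you rank candidate words by relatedness or feed the vectors into downstream NLP models.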
No commits in the last 6 months.
Use this if you need to integrate a nuanced understanding of Spanish word meanings into your natural language processing applications, research, or data analysis projects.
Not ideal if your project requires word embeddings for a language other than Spanish or if you need to train highly specialized embeddings on a very specific, niche Spanish corpus.
Stars
44
Forks
5
Language
—
License
—
Category
—
Last pushed
Jun 22, 2019
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/aitoralmeida/spanish_word2vec"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
Planeshifter/node-word2vec
Node.js interface to the Google word2vec tool.
nathanrooy/word2vec-from-scratch-with-python
A very simple, bare-bones, inefficient implementation of skip-gram word2vec from scratch in Python
thunlp/paragraph2vec
Paragraph Vector Implementation
akoksal/Turkish-Word2Vec
Pre-trained Word2Vec Model for Turkish
RichDavis1/PHPW2V
A PHP implementation of Word2Vec, a popular word embedding algorithm created by Tomas Mikolov...