pat-coady/word2vec
Learning Word Vectors from Project Gutenberg Texts
This project learns word vectors (word2vec-style embeddings) from large collections of text, such as books. You feed in raw text documents, and it outputs numerical vector representations of words that capture their meaning and context. This is useful for researchers, linguists, or anyone interested in computational linguistics and text analysis.
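At the heart of this approach, skip-gram word2vec turns raw text into (center word, context word) training pairs. A minimal sketch of that pair generation (illustrative only, not this repo's code; the function name and window size are assumptions):

```python
def skipgram_pairs(tokens, window=2):
    """Yield (center, context) pairs as used to train skip-gram word2vec."""
    pairs = []
    for i, center in enumerate(tokens):
        # Context words are the neighbors within `window` positions on either side.
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

sentence = "the quick brown fox".split()
print(skipgram_pairs(sentence, window=1))
# → [('the', 'quick'), ('quick', 'the'), ('quick', 'brown'),
#    ('brown', 'quick'), ('brown', 'fox'), ('fox', 'brown')]
```

A model then learns a vector per word such that a center word's vector predicts its context words; words appearing in similar contexts end up with similar vectors.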
No commits in the last 6 months.
Use this if you want to find words with similar meanings, predict analogies, or visualize word relationships based on how they appear in a document corpus.
Not ideal if you need a pre-trained, ready-to-use solution for general text understanding without custom training.
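The "similar meanings" and "analogies" use cases rest on vector arithmetic in the embedding space: the classic example is vec("king") − vec("man") + vec("woman") ≈ vec("queen"). A toy sketch with hand-picked 2-d vectors (the vocabulary and values are illustrative, not learned embeddings):

```python
import math

# Hypothetical 2-d "embeddings": axis 0 ~ gender, axis 1 ~ royalty.
vecs = {
    "man":   [1.0, 0.0],
    "woman": [-1.0, 0.0],
    "king":  [1.0, 1.0],
    "queen": [-1.0, 1.0],
    "apple": [0.1, -0.9],   # unrelated distractor word
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def analogy(a, b, c):
    """Solve a : b :: c : ?  by ranking words near  vec(b) - vec(a) + vec(c)."""
    target = [vb - va + vc for va, vb, vc in zip(vecs[a], vecs[b], vecs[c])]
    candidates = set(vecs) - {a, b, c}   # exclude the query words themselves
    return max(candidates, key=lambda w: cosine(vecs[w], target))

print(analogy("man", "king", "woman"))  # → queen
```

With real corpus-trained embeddings the same arithmetic works, just in hundreds of dimensions instead of two.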
Stars
8
Forks
1
Language
Jupyter Notebook
License
MIT
Category
Last pushed
Aug 17, 2017
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/pat-coady/word2vec"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
nathanrooy/word2vec-from-scratch-with-python
A very simple, bare-bones, inefficient implementation of skip-gram word2vec from scratch with Python
Planeshifter/node-word2vec
Node.js interface to the Google word2vec tool.
thunlp/paragraph2vec
Paragraph Vector Implementation
akoksal/Turkish-Word2Vec
Pre-trained Word2Vec Model for Turkish
RichDavis1/PHPW2V
A PHP implementation of Word2Vec, a popular word embedding algorithm created by Tomas Mikolov...