nathanrooy/word2vec-from-scratch-with-python
A very simple, bare-bones, inefficient implementation of skip-gram word2vec, written from scratch in Python
This project helps natural language processing (NLP) enthusiasts and students understand how word embeddings are created. Feed it a text corpus and it demonstrates how word relationships can be represented numerically. It's ideal for anyone learning the foundational concepts behind modern NLP techniques.
101 stars. No commits in the last 6 months.
Use this if you want to understand the core mechanics of how algorithms like Word2Vec learn word relationships from text data.
Not ideal if you need a fast, production-ready tool to generate word embeddings for a real-world application.
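To illustrate the core mechanics the repo teaches, here is a minimal skip-gram sketch in plain NumPy. This is not the repo's actual code, just an illustrative toy: for each (center, context) word pair within a window, it predicts the context word from the center word's embedding via softmax and updates both weight matrices by gradient descent.

```python
# Minimal skip-gram sketch (illustrative; not this repo's code).
# Learns tiny word embeddings from a toy corpus with plain NumPy.
import numpy as np

corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
w2i = {w: i for i, w in enumerate(vocab)}
V, D, window, lr = len(vocab), 10, 2, 0.05

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # input (center-word) embeddings
W_out = rng.normal(scale=0.1, size=(D, V))  # output (context-word) weights

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Build (center, context) training pairs within the window
pairs = [(w2i[corpus[i]], w2i[corpus[j]])
         for i in range(len(corpus))
         for j in range(max(0, i - window), min(len(corpus), i + window + 1))
         if i != j]

for epoch in range(50):
    for c, o in pairs:
        h = W_in[c]                   # hidden layer = center word's embedding
        p = softmax(h @ W_out)        # predicted distribution over context words
        err = p.copy(); err[o] -= 1   # cross-entropy gradient w.r.t. logits
        W_out -= lr * np.outer(h, err)
        W_in[c] -= lr * (W_out @ err)

embedding = W_in  # one D-dimensional vector per vocabulary word
```

After training, `embedding` holds one dense vector per word; words appearing in similar contexts end up with similar vectors, which is the property the repo walks through step by step.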
Stars
101
Forks
45
Language
Python
License
MIT
Category
Last pushed
Feb 11, 2019
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/nathanrooy/word2vec-from-scratch-with-python"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related tools
Planeshifter/node-word2vec
Node.js interface to the Google word2vec tool.
thunlp/paragraph2vec
Paragraph Vector Implementation
akoksal/Turkish-Word2Vec
Pre-trained Word2Vec Model for Turkish
RichDavis1/PHPW2V
A PHP implementation of Word2Vec, a popular word embedding algorithm created by Tomas Mikolov...
YuyuZha0/word2vec
A word2vec implementation for Chinese, based on deeplearning4j and ansj