nathanrooy/word2vec-from-scratch-with-python

A very simple, bare-bones, inefficient implementation of skip-gram word2vec, written from scratch in Python

46 / 100 (Emerging)

This project helps natural language processing (NLP) enthusiasts and students understand how word embeddings are created: you feed in a text corpus, and it shows how word relationships can be represented numerically. It's ideal for anyone learning the foundational concepts behind modern NLP techniques.
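The core idea behind skip-gram — predict each word's neighbors within a small context window — can be sketched with the first step any such implementation performs: turning a corpus into (center, context) training pairs. The corpus and window size below are illustrative, not taken from this repository.

```python
# Minimal sketch of skip-gram pair generation (illustrative corpus/window,
# not the repository's own code).
corpus = "the quick brown fox jumps over the lazy dog".split()
window = 2  # how many words on each side count as "context"

# For each center word, pair it with every word inside its window.
pairs = []
for i, center in enumerate(corpus):
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if j != i:
            pairs.append((center, corpus[j]))

print(pairs[:4])
# First pairs: ('the', 'quick'), ('the', 'brown'), ('quick', 'the'), ('quick', 'brown')
```

Training then adjusts each word's vector so that words sharing many context pairs end up close together — which is why the learned embeddings capture word relationships numerically.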

101 stars. No commits in the last 6 months.

Use this if you want to understand the core mechanics of how algorithms like Word2Vec learn word relationships from text data.

Not ideal if you need a fast, production-ready tool to generate word embeddings for a real-world application.

natural-language-processing computational-linguistics text-analytics machine-learning-education data-science-fundamentals
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 9 / 25
Maturity 16 / 25
Community 21 / 25

How are scores calculated?

Stars: 101
Forks: 45
Language: Python
License: MIT
Last pushed: Feb 11, 2019
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/nathanrooy/word2vec-from-scratch-with-python"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.