cr0wley-zz/Embeddings

A study of the ingenious word2vec concept. The repository contains detailed implementations of the CBOW and Skip-gram architectures.

27 / 100 (Experimental)

This project helps machine learning engineers and natural language processing (NLP) researchers understand how to represent words as numerical vectors, a fundamental step for many NLP applications. It provides detailed code examples for word embedding techniques such as Word2Vec (CBOW and Skip-gram) and GloVe. You can feed in raw text data and train your own word embedding models.
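To make the CBOW idea concrete: the model predicts a centre word from the average of its context-word vectors. The following is a minimal NumPy sketch of that training loop on a toy sentence; it is an illustration of the technique, not the repository's own code, and the corpus, dimensions, and learning rate are arbitrary choices.

```python
import numpy as np

# Toy corpus and vocabulary (illustrative only)
corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D, window = len(vocab), 10, 2

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # input (context) embeddings
W_out = rng.normal(scale=0.1, size=(D, V))  # output (centre-word) weights

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Build (context indices, centre index) training pairs
pairs = []
for i in range(len(corpus)):
    ctx = [idx[corpus[j]]
           for j in range(max(0, i - window), min(len(corpus), i + window + 1))
           if j != i]
    pairs.append((ctx, idx[corpus[i]]))

lr = 0.05
for epoch in range(200):
    for ctx, centre in pairs:
        h = W_in[ctx].mean(axis=0)           # average the context embeddings
        p = softmax(h @ W_out)               # predicted centre-word distribution
        err = p.copy()
        err[centre] -= 1.0                   # gradient of cross-entropy + softmax
        g_h = W_out @ err                    # gradient w.r.t. the hidden vector
        W_out -= lr * np.outer(h, err)
        W_in[ctx] -= lr * g_h / len(ctx)     # spread gradient over context words

# After training, each row of W_in is that word's learned embedding
print(W_in[idx["fox"]].shape)  # (10,)
```

Skip-gram inverts this setup: the centre word's embedding is used to predict each context word instead.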

No commits in the last 6 months.

Use this if you are an NLP practitioner who wants to deeply understand the mechanics of word embeddings, including the underlying math and practical implementation.

Not ideal if you are looking for a ready-to-use library to apply pre-trained word embeddings without needing to understand their internal workings.

natural-language-processing machine-learning-engineering word-vectors text-representation deep-learning-research
No License · Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 4 / 25
Maturity 8 / 25
Community 15 / 25

How are scores calculated?
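The figures shown here suggest the overall score is simply the sum of the four category scores, each out of 25. This is an inference from the listed numbers, not a documented formula:

```python
# Category scores from the listing above (each out of 25)
scores = {"Maintenance": 0, "Adoption": 4, "Maturity": 8, "Community": 15}

# Hypothesis: the overall score is the plain sum of the four categories
overall = sum(scores.values())
print(overall)  # 27, matching the 27 / 100 shown above
```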

Stars: 7
Forks: 4
Language: Jupyter Notebook
License: None
Last pushed: Feb 11, 2021
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/embeddings/cr0wley-zz/Embeddings"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
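From a script, the endpoint can be consumed with the standard library's `json` module. The field names below are assumptions for illustration; inspect the real payload returned by the curl command above before relying on a specific schema:

```python
import json

# Hypothetical response body (field names are assumptions, not the
# documented schema of the pt-edge API)
sample = """{
  "repo": "cr0wley-zz/Embeddings",
  "score": 27,
  "categories": {"maintenance": 0, "adoption": 4, "maturity": 8, "community": 15}
}"""

data = json.loads(sample)
print(data["score"])  # 27
strongest = max(data["categories"], key=data["categories"].get)
print(strongest)  # community
```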