mkaufma90/word2vec-deps
Dependency based word embeddings
This project lets natural language processing researchers and students explore an alternative way to generate word embeddings. Instead of the linear word windows used by standard word2vec, it derives training contexts from dependency parses, which tends to emphasize syntactic and functional similarity over topical relatedness. It is aimed at those studying advanced NLP techniques.
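To make the idea concrete, here is a minimal sketch of dependency-based context extraction in the style of Levy and Goldberg's dependency embeddings. It is not taken from this repository's code: the `dependency_contexts` function, the 1-based `(token, head_index, relation)` input format, and the `relI_` inverse-relation naming are all illustrative assumptions. Each modifier contributes a context to its head, and the head contributes an inverse context to the modifier; these (word, context) pairs would then be fed to a skip-gram-style trainer in place of window-based pairs.

```python
# Sketch only: assumes a sentence already parsed into (token, head_index,
# relation) triples, with 1-based indices and head_index 0 marking the root.

def dependency_contexts(parsed):
    """Yield (word, context) pairs from one dependency-parsed sentence.

    For an arc head --rel--> modifier:
      - the head gets the context  "rel_modifier"
      - the modifier gets the inverse context "relI_head"
    """
    pairs = []
    for i, (token, head, rel) in enumerate(parsed, start=1):
        if head == 0:
            continue  # the root token has no head, so no arc to record
        head_token = parsed[head - 1][0]
        pairs.append((head_token, f"{rel}_{token}"))   # head sees its modifier
        pairs.append((token, f"{rel}I_{head_token}"))  # modifier sees its head
    return pairs

# "scientist discovers star": "scientist" is the subject of "discovers",
# "star" is its direct object.
sentence = [
    ("scientist", 2, "nsubj"),
    ("discovers", 0, "root"),
    ("star", 2, "dobj"),
]
for word, ctx in dependency_contexts(sentence):
    print(word, ctx)
# discovers nsubj_scientist
# scientist nsubjI_discovers
# discovers dobj_star
# star dobjI_discovers
```

Because contexts are tied to grammatical relations rather than adjacency, words that fill the same syntactic roles end up with similar vectors even when they rarely co-occur in a fixed-size window.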
No commits in the last 6 months.
Use this if you are a researcher or student who wants to experiment with dependency-based word embeddings for academic exploration rather than production applications.
Not ideal if you need a production-ready solution for generating word embeddings or require robust performance and scalability.
Stars
12
Forks
1
Language
Jupyter Notebook
License
—
Category
Last pushed
Jun 19, 2017
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/mkaufma90/word2vec-deps"
Open to everyone: 100 requests/day with no key. Get a free key for 1,000 requests/day.
Higher-rated alternatives
Planeshifter/node-word2vec
Node.js interface to the Google word2vec tool.
nathanrooy/word2vec-from-scratch-with-python
A very simple, bare-bones, inefficient implementation of skip-gram word2vec from scratch in Python
thunlp/paragraph2vec
Paragraph Vector Implementation
akoksal/Turkish-Word2Vec
Pre-trained Word2Vec Model for Turkish
RichDavis1/PHPW2V
A PHP implementation of Word2Vec, a popular word embedding algorithm created by Tomas Mikolov...