rozester/Arabic-Word-Embeddings-Word2vec
Arabic Word Embeddings Word2vec
This repository helps computational linguists and data scientists working with Arabic text. It provides pre-trained word embeddings for the Arabic language, representing words as numerical vectors. Given Arabic words as input, it outputs their vector representations, enabling machines to capture semantic relationships between them. You would use this if you're building applications that process or understand Arabic.
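As a minimal sketch of what "words as numerical vectors" enables: semantically related words end up with similar vectors, which you can measure with cosine similarity. The words and 4-dimensional vectors below are invented toy data for illustration only, not the repo's actual embeddings (real pre-trained vectors typically have 100-300 dimensions).

```python
import numpy as np

# Toy embeddings (invented for illustration; not from this repository).
embeddings = {
    "ملك":   np.array([0.9, 0.1, 0.4, 0.0]),  # "king"
    "ملكة":  np.array([0.8, 0.2, 0.5, 0.1]),  # "queen"
    "تفاحة": np.array([0.0, 0.9, 0.1, 0.8]),  # "apple"
}

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors; 1.0 means identical direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

sim_royal = cosine_similarity(embeddings["ملك"], embeddings["ملكة"])
sim_fruit = cosine_similarity(embeddings["ملك"], embeddings["تفاحة"])
# Related words ("king"/"queen") score higher than unrelated ones
# ("king"/"apple").
```

With real pre-trained Arabic vectors you would load the model (e.g. via gensim's `KeyedVectors`) instead of defining a dictionary by hand, but the similarity computation is the same.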
No commits in the last 6 months.
Use this if you need pre-calculated, numerical representations of Arabic words to analyze text or power language-aware applications.
Not ideal if you need word embeddings for a language other than Arabic, or if you prefer to train your own embeddings from scratch.
Stars
27
Forks
9
Language
JavaScript
License
—
Category
nlp
Last pushed
Dec 13, 2018
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/rozester/Arabic-Word-Embeddings-Word2vec"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
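The same request can be made from Python. This is a hedged sketch: the URL is taken from the curl command above, but the response schema is an assumption — inspect the returned JSON for the actual field names.

```python
import json
import urllib.request

# Build the endpoint URL shown in the curl example above.
owner, repo = "rozester", "Arabic-Word-Embeddings-Word2vec"
url = (
    "https://pt-edge.onrender.com/api/v1/quality/nlp/"
    f"{owner}/{repo}"
)

def fetch_quality(endpoint, timeout=10):
    # No API key needed on the free tier (100 requests/day);
    # a free key raises the limit to 1,000/day.
    with urllib.request.urlopen(endpoint, timeout=timeout) as resp:
        return json.load(resp)

# data = fetch_quality(url)  # returns the parsed JSON response
```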
Higher-rated alternatives
nathanrooy/word2vec-from-scratch-with-python
A very simple, bare-bones, inefficient implementation of skip-gram word2vec from scratch with Python
Planeshifter/node-word2vec
Node.js interface to the Google word2vec tool.
thunlp/paragraph2vec
Paragraph Vector Implementation
akoksal/Turkish-Word2Vec
Pre-trained Word2Vec Model for Turkish
RichDavis1/PHPW2V
A PHP implementation of Word2Vec, a popular word embedding algorithm created by Tomas Mikolov...