YannDubs/Hash-Embeddings
PyTorch implementation of Hash Embeddings (NIPS 2017). Submission to the NIPS Implementation Challenge.
This project implements hash embeddings, a technique that maps text tokens to compact numerical vectors using far fewer parameters than a standard embedding table. Given text input, it produces representations suitable for downstream classification. It is aimed at data scientists and NLP engineers who need high-performing text classification models over large vocabularies.
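To illustrate the idea behind the repository, here is a minimal plain-Python sketch of a hash embedding lookup as described in the NIPS 2017 paper: each token is hashed by k functions into a shared pool of component vectors, which are then combined with per-token importance weights. Class names, pool sizes, and the use of MD5 as the hash are illustrative assumptions, not code from this repository (the real implementation is a trainable PyTorch module).

```python
import hashlib
import random

def h(token, seed, buckets):
    # Deterministic seeded hash of a token, mapped into [0, buckets).
    digest = hashlib.md5(f"{seed}:{token}".encode()).hexdigest()
    return int(digest, 16) % buckets

class HashEmbedding:
    """Illustrative (non-trainable) hash embedding lookup."""

    def __init__(self, n_buckets=1000, dim=8, k=2, n_weight_slots=10000, seed=0):
        rng = random.Random(seed)
        self.k = k
        self.n_buckets = n_buckets
        self.n_weight_slots = n_weight_slots
        # Shared pool of component vectors; trainable in the real model.
        self.pool = [[rng.gauss(0, 0.1) for _ in range(dim)]
                     for _ in range(n_buckets)]
        # Per-token importance weights, one per hash function;
        # also trainable in the real model.
        self.weights = [[rng.gauss(0, 0.1) for _ in range(k)]
                        for _ in range(n_weight_slots)]

    def embed(self, token):
        # Pick this token's importance weights, then form the weighted
        # sum of its k component vectors from the shared pool.
        w = self.weights[h(token, "w", self.n_weight_slots)]
        vec = [0.0] * len(self.pool[0])
        for i in range(self.k):
            component = self.pool[h(token, i, self.n_buckets)]
            for j, c in enumerate(component):
                vec[j] += w[i] * c
        return vec
```

The memory saving comes from the pool size (here 1,000 vectors) being independent of vocabulary size: arbitrarily many tokens share the same small pool, and the learned importance weights let the model disambiguate hash collisions.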
206 stars. No commits in the last 6 months.
Use this if you need to build efficient Natural Language Processing models for text classification, especially when working with large datasets and you want to reduce the memory footprint of word embeddings.
Not ideal if you need a pre-trained model for immediate use or are not comfortable with implementing and evaluating custom embedding layers in a deep learning framework.
Stars
206
Forks
30
Language
Python
License
MIT
Last pushed
Nov 12, 2018
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/embeddings/YannDubs/Hash-Embeddings"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
cosmosgl/graph
GPU-accelerated force graph layout and rendering
Clay-foundation/model
The Clay Foundation Model - An open source AI model and interface for Earth
nomic-ai/nomic
Nomic Developer API SDK
omoindrot/tensorflow-triplet-loss
Implementation of triplet loss in TensorFlow
sashakolpakov/dire-jax
DImensionality REduction in JAX