asappresearch/sru
Training RNNs as Fast as CNNs (https://arxiv.org/abs/1709.02755)
SRU (Simple Recurrent Unit) is a recurrent neural network layer designed to speed up the training of natural language processing (NLP) models. It takes sequences of data, such as text or speech embeddings, as input and processes them much faster than traditional LSTM networks while maintaining comparable accuracy: the expensive matrix multiplications depend only on the input, so they can be computed for all timesteps in parallel. It is aimed at machine learning engineers and researchers who are building and training NLP models.
2,112 stars. No commits in the last 6 months.
Use this if you need to train large NLP models more quickly and efficiently, especially when dealing with sequential data.
Not ideal if your primary task does not involve sequential data processing or if you are not working with deep learning models.
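The speedup comes from restructuring the recurrence so that only a cheap elementwise update is sequential. A minimal single-layer NumPy sketch of the equations from the paper (arXiv:1709.02755) illustrates this; it is a reference of the math, not the library's batched CUDA implementation, and all variable names here are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sru_layer(x, W, Wf, bf, Wr, br):
    """Reference SRU forward pass for one sequence.

    x: (seq_len, d) inputs; W, Wf, Wr: (d, d) weights; bf, br: (d,) biases.
    """
    seq_len, d = x.shape
    # All matrix products depend only on x, so they run for every
    # timestep at once -- this is the parallel (CNN-like) part.
    x_tilde = x @ W
    f = sigmoid(x @ Wf + bf)   # forget gate
    r = sigmoid(x @ Wr + br)   # reset / highway gate
    # Only this lightweight elementwise recurrence is sequential.
    c = np.zeros(d)
    h = np.empty_like(x)
    for t in range(seq_len):
        c = f[t] * c + (1.0 - f[t]) * x_tilde[t]
        h[t] = r[t] * np.tanh(c) + (1.0 - r[t]) * x[t]
    return h

# Toy usage with random weights (shapes only; not trained parameters).
rng = np.random.default_rng(0)
T, d = 5, 8
x = rng.standard_normal((T, d))
W, Wf, Wr = (0.1 * rng.standard_normal((d, d)) for _ in range(3))
h = sru_layer(x, W, Wf, np.zeros(d), Wr, np.zeros(d))
```

The actual library provides this as a drop-in PyTorch module with fused CUDA kernels; the sketch only shows why the heavy computation parallelizes across timesteps.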
Stars
2,112
Forks
305
Language
Python
License
MIT
Category
ML Frameworks
Last pushed
Jan 04, 2022
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/asappresearch/sru"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
kk7nc/RMDL
RMDL: Random Multimodel Deep Learning for Classification
MaximeVandegar/Papers-in-100-Lines-of-Code
Implementation of papers in 100 lines of code.
OML-Team/open-metric-learning
Metric learning and retrieval pipelines, models and zoo.
miguelvr/dropblock
Implementation of DropBlock: A regularization method for convolutional networks in PyTorch.
DLTK/DLTK
Deep Learning Toolkit for Medical Image Analysis