asappresearch/sru

Training RNNs as Fast as CNNs (https://arxiv.org/abs/1709.02755)

48 / 100 · Emerging

This is a recurrent neural network (RNN) layer designed to speed up the training of natural language processing (NLP) models. It takes sequences of data, such as text or speech embeddings, as input and processes them substantially faster than traditional LSTM networks while maintaining comparable accuracy. It is aimed at machine learning engineers and researchers building and training NLP models.
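The speedup comes from SRU's light recurrence: the gates depend only on the current input, not on the previous hidden state, so the heavy matrix multiplications can be computed for all time steps in parallel, leaving only cheap elementwise operations in the sequential loop. A minimal single-feature sketch of the recurrence described in the paper, in pure Python (parameter names here are illustrative, not the library's actual API):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sru_cell(xs, w, wf, bf, wr, br):
    """Single-feature SRU recurrence sketch.

    xs: input sequence (one scalar feature per step).
    w, wf, bf, wr, br: illustrative scalar weights/biases for the
    input transform, forget gate, and reset gate.
    """
    c = 0.0       # initial cell state
    hs = []       # hidden states per time step
    for x in xs:
        f = sigmoid(wf * x + bf)          # forget gate: depends only on x_t
        c = f * c + (1.0 - f) * (w * x)   # cell state: light recurrence
        r = sigmoid(wr * x + br)          # reset gate: also input-only
        h = r * c + (1.0 - r) * x         # highway connection to the input
        hs.append(h)
    return hs
```

Because `f` and `r` involve no previous state, a real implementation precomputes `W x_t`, `W_f x_t`, and `W_r x_t` for the whole sequence in one batched matrix multiply; only the elementwise `c` update is inherently sequential.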

2,112 stars. No commits in the last 6 months.

Use this if you need to train large NLP models more quickly and efficiently, especially when dealing with sequential data.

Not ideal if your primary task does not involve sequential data processing or if you are not working with deep learning models.

natural-language-processing machine-learning-engineering deep-learning-training sequence-modeling computational-linguistics
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 22 / 25


Stars: 2,112
Forks: 305
Language: Python
License: MIT
Last pushed: Jan 04, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/asappresearch/sru"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.