tatp22/linformer-pytorch

My take on a practical implementation of Linformer for PyTorch.

Score: 51 / 100 (Established)

This project offers an efficient way to process very long sequences, such as entire books or extensive codebases, using Linformer, a Transformer variant whose self-attention scales linearly with sequence length instead of quadratically. It takes in lengthy text or sequential data and produces analyzed outputs, such as next-token predictions or feature representations, at far lower cost than standard self-attention on long inputs. Data scientists and machine learning engineers working with large-scale natural language processing or sequential data tasks will find this particularly useful.

422 stars. Used by 1 other package. No commits in the last 6 months. Available on PyPI.

Use this if you need to build language models or sequence processors that can handle input sequences of millions of tokens without prohibitive computational costs.

Not ideal if your primary concern is interpretability or if you are working with very short sequences where the performance gains of linear attention are negligible.
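The "linear attention" mentioned above refers to the Linformer trick of projecting the length-n key and value sequences down to a fixed length k, so the attention score matrix is n x k rather than n x n. A minimal NumPy sketch of that idea (not this package's API; names like `E` and `F` follow the Linformer paper's notation):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def linformer_attention(Q, K, V, E, F):
    """Linformer-style attention.

    E and F are (k, n) projection matrices that compress the
    length-n key/value sequences to length k, so the score
    matrix is (n, k) instead of (n, n) -- linear in n.
    """
    d = Q.shape[-1]
    K_proj = E @ K                        # (k, d)
    V_proj = F @ V                        # (k, d)
    scores = Q @ K_proj.T / np.sqrt(d)    # (n, k)
    return softmax(scores, axis=-1) @ V_proj  # (n, d)

rng = np.random.default_rng(0)
n, d, k = 1024, 64, 128  # illustrative sizes
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
E = rng.standard_normal((k, n)) / np.sqrt(n)
F = rng.standard_normal((k, n)) / np.sqrt(n)
out = linformer_attention(Q, K, V, E, F)
print(out.shape)  # (1024, 64)
```

For short sequences the n x n score matrix is already cheap, which is why the projection buys little there.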

natural-language-processing large-scale-text-analysis sequence-modeling machine-learning-engineering
Stale: 6 months
Maintenance: 0 / 25
Adoption: 11 / 25
Maturity: 25 / 25
Community: 15 / 25


Stars: 422
Forks: 37
Language: Python
License: MIT
Last pushed: Jul 27, 2022
Commits (30d): 0
Dependencies: 2
Reverse dependents: 1

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/tatp22/linformer-pytorch"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
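The same endpoint can be called from Python. A minimal sketch using only the standard library; the URL is the one shown above, but the response schema is not documented here, so the actual request is left commented out:

```python
import json
import urllib.request

# Endpoint from this page; path segments are <category>/<owner>/<repo>.
BASE = "https://pt-edge.onrender.com/api/v1/quality"
url = f"{BASE}/ml-frameworks/tatp22/linformer-pytorch"

# Uncomment to perform the request (anonymous tier: 100 requests/day):
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)  # response schema undocumented here
#     print(data)

print(url)
```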