dingo-actual/infini-transformer

PyTorch implementation of Infini-Transformer from "Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention" (https://arxiv.org/abs/2404.07143)

Score: 40/100 (Emerging)

This project implements a deep learning model for extremely long text inputs in natural language processing. It processes large textual datasets, such as books or very long documents, to support tasks like content understanding, question answering, and text generation. It is aimed at researchers and engineers building NLP systems that must handle extensive contexts without running into the memory limits of standard attention.
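The long-context capability comes from the paper's Infini-attention mechanism: each segment is processed with ordinary local attention, while a fixed-size compressive memory accumulates key-value associations from all previous segments, so memory use stays constant regardless of sequence length. The sketch below illustrates that mechanism as described in the paper, not this repository's actual API; all function and variable names here are hypothetical.

```python
import torch

def elu1(x):
    # sigma(x) = ELU(x) + 1, the positive feature map used for the
    # linear-attention style memory in the Infini-attention paper
    return torch.nn.functional.elu(x) + 1.0

def infini_attention_step(q, k, v, mem, z, beta):
    """One segment of (single-head, simplified) Infini-attention.

    q, k, v: (seq, dim) query/key/value projections for this segment
    mem:     (dim, dim) compressive memory carried across segments
    z:       (dim,)     normalization term carried across segments
    beta:    scalar learned gate mixing memory vs. local attention
    """
    sq, sk = elu1(q), elu1(k)
    # Retrieve long-term context from the memory built on prior segments
    a_mem = (sq @ mem) / (sq @ z).clamp(min=1e-6).unsqueeze(-1)
    # Standard softmax attention within the current segment
    scores = (q @ k.T) / q.shape[-1] ** 0.5
    a_local = torch.softmax(scores, dim=-1) @ v
    # Learned gate blends long-term (memory) and local context
    g = torch.sigmoid(beta)
    out = g * a_mem + (1 - g) * a_local
    # Linear memory update: M <- M + sigma(K)^T V, z <- z + sum sigma(K)
    mem = mem + sk.T @ v
    z = z + sk.sum(dim=0)
    return out, mem, z
```

In use, a long input is split into fixed-length segments and this step is applied to each in order, threading `mem` and `z` (initialized to zeros) through the loop; that is what keeps the memory footprint independent of total sequence length.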

298 stars. No commits in the last 6 months.

Use this if you are developing AI models that need to process and understand very long text sequences, such as entire books, legal documents, or complex scientific papers, where traditional models struggle with memory constraints.

Not ideal if your primary need is processing short, fixed-length text snippets or if you are looking for an out-of-the-box solution without deep learning development experience.

Tags: Natural Language Processing · Large Language Models · Text Understanding · AI Research · Information Extraction
Badges: Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 14 / 25

How are scores calculated?

Stars: 298
Forks: 23
Language: Python
License: MIT
Last pushed: May 04, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/dingo-actual/infini-transformer"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
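The same endpoint shown in the curl command can be called from Python with the standard library alone. The response schema is not documented here, so the sketch below simply returns the decoded JSON; the helper name is hypothetical.

```python
import json
import urllib.request

def fetch_quality_report(owner: str, repo: str) -> dict:
    """Fetch a repo's quality report from the pt-edge API.

    Mirrors the curl command above. No API key is needed for up to
    100 requests/day; pass a key via headers for higher limits.
    """
    url = ("https://pt-edge.onrender.com/api/v1/quality/"
           f"transformers/{owner}/{repo}")
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# Example (performs a network request):
# report = fetch_quality_report("dingo-actual", "infini-transformer")
```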