kyegomez/MambaTransformer

Integrating Mamba/SSMs with Transformer for Enhanced Long Context and High-Quality Sequence Modeling

Quality score: 57 / 100 (Established)

This project helps AI developers build advanced language models that can understand and generate very long sequences of text more effectively. It takes in raw text or tokenized sequences and outputs predictions or generated text, suitable for tasks requiring deep understanding of extensive content. Developers working on sophisticated natural language processing applications will find this useful.

215 stars. Available on PyPI.

Use this if you are developing AI models that need to process or generate very long texts with high accuracy and improved reasoning, such as in advanced content generation or complex document analysis.

Not ideal if you are working with short, simple text sequences or if you prefer a standard, widely adopted transformer architecture for typical NLP tasks.

Tags: AI model development · long-form text processing · sequence generation · natural language processing · machine learning architecture
Maintenance 10 / 25
Adoption 10 / 25
Maturity 25 / 25
Community 12 / 25


Stars: 215
Forks: 16
Language: Python
License: MIT
Last pushed: Jan 30, 2026
Commits (30d): 0
Dependencies: 3

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/kyegomez/MambaTransformer"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
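For scripted access, the curl call above can be reproduced in Python. A minimal sketch: building the per-repo endpoint URL and reading a response. The JSON field names (`score`, `stars`) are assumptions about the payload shape, not documented here; only the URL path comes from the curl example.

```python
import json
from urllib.parse import quote

# Base path taken from the curl example above
BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    # Build the per-repo endpoint URL, escaping path segments
    return f"{BASE}/{quote(owner)}/{quote(repo)}"

def summarize(payload: str) -> str:
    # Field names here are illustrative assumptions about the JSON shape
    data = json.loads(payload)
    return f"{data.get('score', '?')}/100, {data.get('stars', '?')} stars"

print(quality_url("kyegomez", "MambaTransformer"))
# Sample payload mirroring the numbers shown on this page
print(summarize('{"score": 57, "stars": 215}'))
# → 57/100, 215 stars
```

Fetching the URL with `urllib.request.urlopen` (or `requests`) and passing the body to `summarize` would complete the round trip; within the free tier, no API key header is needed.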