kyegomez/Jamba

PyTorch Implementation of Jamba: "Jamba: A Hybrid Transformer-Mamba Language Model"

Score: 56/100 (Established)

This project provides a flexible deep learning model for processing sequences of data, such as text or time series. It combines Transformer attention blocks with Mamba state-space layers to model complex patterns in long sequences efficiently. Machine learning engineers and researchers can use it to experiment with and build advanced language models or similar sequence-processing applications.

208 stars. Available on PyPI.

Use this if you are a machine learning engineer or researcher looking to implement or experiment with a hybrid transformer-mamba architecture for sequence modeling.

Not ideal if you need an out-of-the-box solution for a specific application rather than a model to develop and train yourself.

deep-learning natural-language-processing sequence-modeling pytorch-development ai-research
Maintenance: 10/25
Adoption: 10/25
Maturity: 25/25
Community: 11/25
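The category scores above account for the overall score; a minimal check, assuming (this is an assumption, not documented here) that the overall score is simply the sum of the four categories:

```python
# Category scores as listed on this card (each out of 25).
scores = {"Maintenance": 10, "Adoption": 10, "Maturity": 25, "Community": 11}

# Assumption: the overall score (out of 100) is the plain sum of the four.
overall = sum(scores.values())
print(overall)  # 56, matching the 56/100 shown above
```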


Stars: 208
Forks: 14
Language: Python
License: MIT
Last pushed: Jan 17, 2026
Commits (30d): 0
Dependencies: 3

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/kyegomez/Jamba"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
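The same endpoint can be called from Python using only the standard library. A minimal sketch, assuming the endpoint returns JSON (the response schema is not documented here, and the `fetch_quality` helper is illustrative, not part of the API):

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(ecosystem: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a repository."""
    return f"{API_BASE}/{ecosystem}/{owner}/{repo}"


def fetch_quality(ecosystem: str, owner: str, repo: str) -> dict:
    """Fetch the quality record as parsed JSON.

    Assumption: the endpoint returns a JSON object; its fields are
    not documented on this card.
    """
    with urllib.request.urlopen(quality_url(ecosystem, owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Mirrors the curl example above; requires network access.
    print(fetch_quality("transformers", "kyegomez", "Jamba"))
```

The unauthenticated rate limit (100 requests/day) applies to these calls as well; a free key raises it to 1,000/day.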