kyegomez/Jamba
PyTorch Implementation of Jamba: "Jamba: A Hybrid Transformer-Mamba Language Model"
This project provides a flexible deep learning model for processing sequences of data, such as text or time series. Following the Jamba paper, it interleaves Transformer attention layers with Mamba state-space (SSM) layers, adding mixture-of-experts (MoE) layers for capacity, to model long sequences efficiently. Machine learning engineers and researchers can use it to experiment with and build language models or similar sequence-processing applications.
208 stars. Available on PyPI.
Use this if you are a machine learning engineer or researcher looking to implement or experiment with a hybrid transformer-mamba architecture for sequence modeling.
Not ideal if you need an out-of-the-box solution for a specific application and do not want to do any deep learning model development yourself.
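The core idea of the hybrid architecture can be sketched in a few lines. The toy NumPy example below (an illustration of the general technique, not this repo's actual API) alternates a quadratic-cost self-attention layer with a linear-time diagonal state-space recurrence, which is the structural trade-off Jamba-style models exploit; all layer names and parameters here are made up for the sketch.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_block(x, rng):
    # Single-head self-attention: every position attends to every position
    # (cost is quadratic in sequence length).
    d = x.shape[-1]
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = softmax(q @ k.T / np.sqrt(d))
    return x + scores @ v            # residual connection

def ssm_block(x, rng):
    # Toy diagonal state-space recurrence: h_t = a*h_{t-1} + b*x_t, y_t = c*h_t.
    # Cost is linear in sequence length, unlike attention.
    d = x.shape[-1]
    a = 0.9 * np.ones(d)             # per-channel decay
    b = 0.1 * rng.standard_normal(d)
    c = rng.standard_normal(d)
    h = np.zeros(d)
    ys = []
    for t in range(x.shape[0]):
        h = a * h + b * x[t]
        ys.append(c * h)
    return x + np.stack(ys)          # residual connection

def hybrid_forward(x, n_layers=4, seed=0):
    # Alternate attention and SSM layers, as in hybrid Transformer-Mamba designs.
    rng = np.random.default_rng(seed)
    for i in range(n_layers):
        x = attention_block(x, rng) if i % 2 == 0 else ssm_block(x, rng)
    return x

x = np.random.default_rng(1).standard_normal((16, 32))  # (seq_len, dim)
y = hybrid_forward(x)
print(y.shape)  # (16, 32)
```

In the real model the SSM layers carry most of the sequence mixing, while a small number of attention layers restore global, content-based routing; here both are reduced to their simplest form to show the alternation pattern only.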
Stars: 208
Forks: 14
Language: Python
License: MIT
Category:
Last pushed: Jan 17, 2026
Commits (30d): 0
Dependencies: 3
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/kyegomez/Jamba"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related models
NVlabs/MambaVision
[CVPR 2025] Official PyTorch Implementation of MambaVision: A Hybrid Mamba-Transformer Vision Backbone
sign-language-translator/sign-language-translator
Python library & framework to build custom translators for the hearing-impaired and translate...
autonomousvision/transfuser
[PAMI'23] TransFuser: Imitation with Transformer-Based Sensor Fusion for Autonomous Driving;...
kyegomez/MultiModalMamba
A novel implementation of fusing ViT with Mamba into a fast, agile, and high performance...
dali92002/DocEnTR
DocEnTr: An end-to-end document image enhancement transformer - ICPR 2022