kyegomez/SimpleMamba

Implementation of a modular, high-performance, and simplistic Mamba for high-speed applications

Score: 26 / 100 — Experimental

This is a tool for machine learning engineers and researchers who are building or experimenting with new deep learning models. It provides a modular, high-performance building block called a 'Mamba block' and a State Space Model (SSM) that can be integrated into larger neural networks. You provide model parameters and input data, and it outputs processed tensors, which are ready for further use in your model architecture.
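To make the "processed tensors" idea concrete, here is a minimal sketch of the recurrence at the heart of any State Space Model block. This is plain NumPy and deliberately not this repository's API (all names here are illustrative): a discrete SSM maps an input sequence to an output sequence through a hidden-state recurrence h_t = A h_{t-1} + B x_t, y_t = C h_t.

```python
import numpy as np

def ssm_scan(A, B, C, x):
    """Minimal discrete state space model scan.

    h_t = A @ h_{t-1} + B @ x_t,  y_t = C @ h_t
    A: (d_state, d_state), B: (d_state, d_in), C: (d_out, d_state)
    x: (seq_len, d_in)  ->  y: (seq_len, d_out)
    """
    h = np.zeros(A.shape[0])
    ys = []
    for x_t in x:
        h = A @ h + B @ x_t      # update hidden state from input
        ys.append(C @ h)         # project state to output
    return np.stack(ys)

# Toy example: 4-step sequence, 2-dim input, 3-dim state, 2-dim output.
rng = np.random.default_rng(0)
A = 0.9 * np.eye(3)              # stable (decaying) state transition
B = rng.standard_normal((3, 2))
C = rng.standard_normal((2, 3))
x = rng.standard_normal((4, 2))
y = ssm_scan(A, B, C, x)
print(y.shape)  # (4, 2)
```

A Mamba block wraps a recurrence like this with input-dependent (selective) parameters and learned projections, but the input-in, tensor-out contract is the same.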

No commits in the last 6 months.

Use this if you are a machine learning practitioner looking to incorporate Mamba architecture components or a full State Space Model into your deep learning projects for sequence processing.

Not ideal if you are an end-user looking for an out-of-the-box application for tasks like natural language generation or image classification, as this is a foundational model building tool.

deep-learning-architecture sequence-modeling neural-network-components machine-learning-engineering model-development
Stale (6 months) · No Package · No Dependents

Maintenance: 0 / 25
Adoption: 7 / 25
Maturity: 16 / 25
Community: 3 / 25


Stars: 40
Forks: 1
Language: Python
License: MIT
Last pushed: Nov 11, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/kyegomez/SimpleMamba"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
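The same endpoint can be queried from Python using only the standard library. The host and path below are taken from the curl example above; the assumption that the endpoint returns JSON is mine, so the sketch separates URL construction (deterministic) from the network call.

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-score endpoint URL for a GitHub repository."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch the quality report; assumes a JSON response body."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

print(quality_url("kyegomez", "SimpleMamba"))
# https://pt-edge.onrender.com/api/v1/quality/transformers/kyegomez/SimpleMamba
```

Within the free tier (100 requests/day without a key), `fetch_quality("kyegomez", "SimpleMamba")` would return the same data shown on this page.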