kyegomez/VortexFusion

Transformers + Mamba + LSTMs All in One Model

Score: 37 / 100 (Emerging)

This project offers a novel deep learning model architecture that combines the strengths of Mamba, Transformer, and LSTM networks. It takes in sequential data, typically numerical representations of text, audio, or other time-series information, and processes it to produce an enhanced sequence output. This is designed for machine learning researchers and practitioners who are experimenting with advanced model designs for sequence-based tasks.

Use this if you are a machine learning researcher or engineer exploring cutting-edge, hybrid model architectures for sequence processing and want to experiment with combining Mamba, Transformer, and LSTM layers.

Not ideal if you are a business user looking for a ready-to-deploy, pre-trained solution for a specific application without getting into model architecture design.

deep-learning-research neural-network-architecture sequence-modeling machine-learning-engineering ai-experimentation
No Package · No Dependents
Maintenance: 10 / 25
Adoption: 5 / 25
Maturity: 16 / 25
Community: 6 / 25


Stars: 14
Forks: 1
Language: Python
License: MIT
Last pushed: Mar 09, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/kyegomez/VortexFusion"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
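For programmatic use, the same endpoint can be called from Python. This is a minimal sketch assuming the endpoint returns a JSON body; only the URL comes from the curl example above, and the response fields are not documented here, so the code just prints whatever JSON comes back.

```python
# Sketch: fetch quality-score data for a repo from the pt-edge API.
# Only the endpoint URL is taken from the listing; the response
# schema is an assumption and is not inspected field-by-field.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(registry: str, owner: str, repo: str) -> str:
    """Build the API URL for a repo's quality data."""
    return f"{BASE}/{registry}/{owner}/{repo}"


def fetch_quality(registry: str, owner: str, repo: str) -> dict:
    """GET the endpoint and decode the JSON body."""
    with urllib.request.urlopen(quality_url(registry, owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    data = fetch_quality("transformers", "kyegomez", "VortexFusion")
    print(json.dumps(data, indent=2))
```

Requests count against the 100/day anonymous quota, so results are worth caching locally if you poll many repos.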