Quantinuum/Quixer

Code repository for the preprint "Quixer: A Quantum Transformer Model"

Score: 43 / 100 (Emerging)

Quixer helps quantum machine learning researchers and practitioners explore a novel quantum transformer architecture. It takes textual data (or other sequential data) as input and outputs predictions based on learned patterns, much like classical deep learning models. This project targets researchers developing or evaluating advanced machine learning models, particularly those interested in applying quantum computing to sequence processing.

No commits in the last 6 months.

Use this if you are a quantum machine learning researcher or a deep learning practitioner interested in exploring the practical implementation and simulation of a quantum-inspired transformer model alongside classical baselines.

Not ideal if you are looking for a pre-trained, production-ready model for immediate application, or if you do not have a strong background in deep learning and quantum computing concepts.

Tags: quantum machine learning, natural language processing, research, sequence modeling, deep learning architectures, quantum computing applications

Badges: Stale (6m), No Package, No Dependents

Maintenance: 2 / 25
Adoption: 7 / 25
Maturity: 16 / 25
Community: 18 / 25


Stars: 36
Forks: 14
Language: Python
License: Apache-2.0
Last pushed: Jun 20, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/Quantinuum/Quixer"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
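The same endpoint can be queried from a script. A minimal Python sketch, assuming only the URL pattern shown above (the shape of the JSON response is not documented here and is an assumption):

```python
# Sketch: query the pt-edge quality API for a repository.
# Only the endpoint URL pattern comes from this page; the JSON
# response schema is an assumption.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-report URL for a repository."""
    return f"{BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON report (requires network access)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.loads(resp.read().decode("utf-8"))


if __name__ == "__main__":
    # Matches the curl example above.
    print(quality_url("nlp", "Quantinuum", "Quixer"))
```

Unauthenticated calls are limited to 100 requests/day, so cache responses if you poll many repositories.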