kyegomez/CT
Implementation of the attention and transformer from "Building Blocks for a Complex-Valued Transformer Architecture"
This project offers an implementation of a specialized attention mechanism and transformer designed for complex-valued data. It takes in complex-valued signals or images, such as those from MRI scans or remote sensing, and processes them using a complex-valued attention mechanism. Scientists and researchers working with multi-dimensional or phase-sensitive image data would use this to improve model robustness.
No commits in the last 6 months.
Use this if you are developing machine learning models for complex-valued signals or images and want to reduce overfitting while maintaining performance.
Not ideal if your data consists solely of real numbers, as this specific implementation is tailored for complex-valued inputs.
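To illustrate the general idea behind complex-valued attention (this is a hedged sketch of one common variant, not this repo's actual API): complex queries and keys produce complex inner products, so a real-valued score is needed before the softmax. One option discussed in the literature is to take the real part of the Hermitian inner product Q K^H, leaving the values complex throughout. A minimal NumPy sketch under that assumption:

```python
import numpy as np

def complex_attention(Q, K, V):
    """Illustrative complex-valued scaled dot-product attention.

    Scores are the real part of the Hermitian inner product Q K^H,
    so the softmax operates on ordinary real numbers; the values V
    and the output remain complex. This is a sketch of one variant,
    not the exact formulation used by kyegomez/CT.
    """
    d = Q.shape[-1]
    # Hermitian transpose of K: complex conjugate, then transpose
    scores = (Q @ K.conj().T).real / np.sqrt(d)
    # numerically stable real-valued softmax over each row of scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # complex-valued attention output

# toy complex-valued inputs: 4 tokens, model dimension 8
rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8)) + 1j * rng.standard_normal((4, 8))
K = rng.standard_normal((4, 8)) + 1j * rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8)) + 1j * rng.standard_normal((4, 8))
out = complex_attention(Q, K, V)
```

The output has the same shape as V and stays complex, so phase information from the input signal is carried through the attention step rather than discarded.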
Stars: 8
Forks: 1
Language: Python
License: MIT
Category:
Last pushed: Mar 11, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/kyegomez/CT"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.
Higher-rated alternatives
philipperemy/keras-attention
Keras Attention Layer (Luong and Bahdanau scores).
tatp22/linformer-pytorch
My take on a practical implementation of Linformer for Pytorch.
datalogue/keras-attention
Visualizing RNNs using the attention mechanism
ematvey/hierarchical-attention-networks
Document classification with Hierarchical Attention Networks in TensorFlow. WARNING: project is...
thushv89/attention_keras
Keras Layer implementation of Attention for Sequential models