aljazfrancic/myo-readings-dataset
Myo armband machine learning EMG dataset of various hand gestures
This project provides raw muscle activity (EMG) readings from the Myo armband for a range of hand and wrist gestures. By offering pre-recorded, labeled data, it helps researchers and engineers develop and test algorithms that recognize specific hand movements. The dataset includes readings for actions such as flexing, extending, and making a fist, and is intended for work on human-computer interaction or prosthetic control.
Use this if you need a pre-collected, labeled dataset of electromyography (EMG) signals from the Myo armband to train and evaluate machine learning models for hand gesture recognition.
Not ideal if you need a dataset that includes a wider range of activities beyond specific wrist and hand gestures or if you require data from different types of EMG sensors.
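A common first step when training gesture classifiers on data like this is slicing the raw signal into fixed-length, overlapping windows. The sketch below assumes the Myo's well-known 8-channel EMG layout at roughly 200 Hz; the list layout, window sizes, and placeholder data are illustrative assumptions, not the dataset's actual on-disk format.

```python
# Sketch: windowing raw 8-channel Myo EMG into fixed-length segments for ML.
# The Myo armband streams 8 EMG channels at ~200 Hz; the layout and window
# parameters here are illustrative assumptions, not this dataset's format.
def window_emg(emg, win=40, step=20):
    """Slice a list of per-sample 8-channel readings into overlapping windows."""
    return [emg[s:s + win] for s in range(0, len(emg) - win + 1, step)]

fake = [[0] * 8 for _ in range(200)]  # ~1 s of placeholder readings at 200 Hz
windows = window_emg(fake)
print(len(windows), len(windows[0]), len(windows[0][0]))  # 9 windows of 40x8
```

Each window can then be paired with its gesture label to form a supervised training example.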
Stars
17
Forks
44
Language
Jupyter Notebook
License
MIT
Category
Last pushed
Jan 27, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/aljazfrancic/myo-readings-dataset"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
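The same endpoint can be called from Python instead of curl. This is a minimal sketch: the URL path is taken from the curl example above, but the shape of the JSON response is not documented here, so decoding it as a dict is an assumption.

```python
import json
from urllib.parse import quote
from urllib.request import urlopen

# Endpoint path copied from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks"

def quality_url(owner: str, repo: str) -> str:
    """Build the API URL for a GitHub owner/repo slug."""
    return f"{BASE}/{quote(owner)}/{quote(repo)}"

def fetch_quality(owner: str, repo: str) -> dict:
    """GET the endpoint and decode it as JSON (requires network access;
    assumes the service returns a JSON object)."""
    with urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

print(quality_url("aljazfrancic", "myo-readings-dataset"))
```

Calling `fetch_quality("aljazfrancic", "myo-readings-dataset")` would retrieve the same record as the curl command, subject to the daily rate limit.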
Related frameworks
mne-tools/mne-python
MNE: Magnetoencephalography (MEG) and Electroencephalography (EEG) in Python
braindecode/braindecode
Deep learning software to decode EEG, ECG or MEG signals
NeuroTechX/moabb
Mother of All BCI Benchmarks
neuromodulation/py_neuromodulation
Real-time analysis of intracranial neurophysiology recordings.
IoBT-VISTEC/MIN2Net
End-to-End Multi-Task Learning for Subject-Independent Motor Imagery EEG Classification (IEEE...