devnithw/gesture-tinyml

Embedded machine learning for gesture recognition using Arduino Nano 33 BLE Sense. Includes source code for programming the board and notebooks for training the model.

25
/ 100
Experimental

This project helps engineers and hobbyists build custom gesture recognition systems on an Arduino Nano 33 BLE Sense. You record specific arm movements, such as a punch or a flex, then use that data to train a tiny machine learning model. The result is a program that lets the board classify those gestures in real time from its accelerometer data.
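The training side of that pipeline can be sketched in a few lines. This is not the repo's actual code: it illustrates, under stated assumptions, the typical TinyML gesture workflow of slicing a 3-axis accelerometer stream into fixed-length windows, flattening each window into a feature vector, and fitting a tiny classifier (a nearest-centroid model stands in here for the trained neural network; the 119-sample window is an assumption borrowed from common Nano 33 BLE Sense tutorials, where the IMU samples at roughly 119 Hz).

```python
import numpy as np

WINDOW = 119  # samples per gesture window (assumed, ~1 s at 119 Hz)
AXES = 3      # ax, ay, az

def windows(stream, length=WINDOW):
    """Split an (N, 3) accelerometer stream into flattened windows."""
    n = len(stream) // length
    return stream[: n * length].reshape(n, length * AXES)

def train_centroids(X, y):
    """Per-gesture mean feature vector; a stand-in for model training."""
    return {label: X[y == label].mean(axis=0) for label in np.unique(y)}

def classify(centroids, window_vec):
    """Predict the gesture whose centroid is closest to the window."""
    return min(centroids, key=lambda g: np.linalg.norm(centroids[g] - window_vec))

# Synthetic data: a "punch" spikes on x, a "flex" spikes on z.
rng = np.random.default_rng(0)
punch = rng.normal([2.0, 0.0, 0.0], 0.1, (WINDOW * 4, AXES))
flex = rng.normal([0.0, 0.0, 2.0], 0.1, (WINDOW * 4, AXES))
X = np.vstack([windows(punch), windows(flex)])
y = np.array(["punch"] * 4 + ["flex"] * 4)

model = train_centroids(X, y)
print(classify(model, windows(punch)[0]))  # prints "punch"
```

On the device, the same windowing runs in the Arduino sketch, which feeds each window to the deployed model instead of computing centroid distances.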

No commits in the last 6 months.

Use this if you need to quickly prototype or implement a simple, low-power gesture recognition feature for an embedded device.

Not ideal if you require advanced gesture analysis, need to classify many complex gestures, or are working with different hardware platforms.

embedded-systems wearable-tech motion-control human-computer-interaction device-prototyping
No License Stale 6m No Package No Dependents
Maintenance 0 / 25
Adoption 4 / 25
Maturity 8 / 25
Community 13 / 25

How are scores calculated?

Stars

8

Forks

2

Language

C

License

Category

edge-camera-ml

Last pushed

Apr 25, 2024

Commits (30d)

0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/devnithw/gesture-tinyml"

Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.