dronefreak/NeuralFlight

Neural control framework for drones using motor imagery EEG classification. Achieves 73% cross-subject accuracy with PyTorch and enables hands-free drone control through imagined hand/feet movements.

Score: 38 / 100 (Emerging)

This project offers a way to control drones using gestures or brain signals. You can use a webcam to track hand or head movements to fly a simulated drone. For advanced users, it processes real brainwave data (EEG) to let you control the drone with imagined movements. It's designed for researchers, accessibility innovators, or anyone interested in human-computer interaction beyond traditional controls.

Use this if you are a researcher in brain-computer interfaces, interested in accessibility tech, or want to experiment with novel drone control methods using common hardware like a webcam.

Not ideal if you need to control a physical drone immediately without any setup, or if you are looking for a commercial-grade, ready-to-deploy drone control system.

brain-computer-interface drone-control accessibility-tech human-computer-interaction motor-imagery
No Package · No Dependents
Maintenance 6 / 25
Adoption 5 / 25
Maturity 13 / 25
Community 14 / 25


Stars: 11
Forks: 3
Language: Python
License: Apache-2.0
Last pushed: Nov 21, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/dronefreak/NeuralFlight"

Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
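If you'd rather query the endpoint from Python than shell, a minimal sketch follows. Only the URL pattern above is taken from this page; the response schema is not documented here, so the code simply decodes and returns whatever JSON the API sends back rather than assuming field names.

```python
"""Minimal sketch: fetch a repository quality score from the pt-edge API.

Only the endpoint URL pattern is taken from the page; the shape of the
JSON response is an assumption left to the caller to inspect.
"""
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the API endpoint URL for one repository."""
    return f"{API_BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """GET the endpoint and decode the JSON body (schema unverified)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Same request as the curl example above.
    print(quality_url("ml-frameworks", "dronefreak", "NeuralFlight"))
```

The URL builder is split out so it can be reused (and tested) without making a network call.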