G0rav/Human_Activity_Recognition

The MHEALTH (Mobile Health) dataset is devised to benchmark techniques dealing with human behavior analysis based on multimodal body sensing.

27 / 100 · Experimental

This project helps researchers and healthcare professionals automatically identify a person's physical activity from wearable sensor data. It takes raw body-motion and vital-sign recordings, such as acceleration and ECG readings from chest, wrist, and ankle sensors, and outputs labels such as 'walking', 'cycling', or 'lying down'. It is well suited to anyone studying human behavior or monitoring patient activity outside a clinical setting.
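The sensor-to-label idea above can be illustrated with a minimal sketch. This is not the repo's actual notebook pipeline; it uses made-up feature values (mean acceleration magnitude and a heart-rate proxy) and a simple nearest-centroid rule purely to show the input/output shape of activity classification:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training windows per activity: [mean |acceleration|, heart-rate proxy].
# These numbers are illustrative assumptions, not MHEALTH statistics.
train = {
    "lying down": rng.normal([1.0, 60.0], [0.1, 2.0], size=(20, 2)),
    "walking":    rng.normal([3.0, 90.0], [0.3, 5.0], size=(20, 2)),
    "cycling":    rng.normal([5.0, 120.0], [0.5, 6.0], size=(20, 2)),
}

# One centroid (mean feature vector) per activity.
centroids = {label: x.mean(axis=0) for label, x in train.items()}

def predict(features: np.ndarray) -> str:
    """Return the activity whose centroid is nearest to the feature vector."""
    return min(centroids, key=lambda lbl: np.linalg.norm(features - centroids[lbl]))

print(predict(np.array([2.9, 88.0])))  # prints "walking"
```

A real pipeline would extract many more features per window (per-axis accelerometer statistics, ECG-derived measures) and use a trained classifier, but the interface is the same: a window of multimodal sensor features in, an activity label out.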

No commits in the last 6 months.

Use this if you need to classify specific human activities from multimodal wearable sensor data, especially for health monitoring or behavioral research.

Not ideal if you need to analyze activities beyond the pre-defined set, or if you only have a single sensor type rather than a multi-sensor setup.

human-behavior-analysis mobile-health activity-recognition wearable-technology patient-monitoring
No License · Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 6 / 25
Maturity 8 / 25
Community 13 / 25


Stars: 24
Forks: 4
Language: Jupyter Notebook
License: None
Last pushed: Mar 10, 2021
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/G0rav/Human_Activity_Recognition"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
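The same endpoint can be called from Python without extra dependencies. This is a hedged sketch: the `quality_url` helper and the JSON response shape are assumptions, and passing an API key is not shown because the key's header or parameter name is not documented here.

```python
import json
import urllib.request

# Base endpoint from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-data URL for a GitHub owner/repo pair (helper name is hypothetical)."""
    return f"{API_BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch the quality record as a dict (network call; 100 requests/day without a key)."""
    with urllib.request.urlopen(quality_url(owner, repo), timeout=10) as resp:
        return json.load(resp)

if __name__ == "__main__":
    data = fetch_quality("G0rav", "Human_Activity_Recognition")
    print(json.dumps(data, indent=2))
```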