G0rav/Human_Activity_Recognition
The MHEALTH (Mobile Health) dataset is devised to benchmark techniques dealing with human behavior analysis based on multimodal body sensing.
This project helps researchers and healthcare professionals automatically identify a person's physical activity from wearable sensor data. It takes raw body-motion and vital-sign recordings (acceleration from chest, wrist, and ankle sensors, plus chest ECG) and outputs labels such as 'walking', 'cycling', or 'lying down'. It's ideal for anyone studying human behavior or monitoring patient activity outside a clinical setting.
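The kind of pipeline described above (sensor windows in, activity labels out) can be sketched in a few lines. This is a hypothetical illustration, not the repo's actual code: synthetic data stands in for real MHEALTH recordings (the dataset has 23 sensor channels covering acceleration, gyroscope, magnetometer, and ECG), and the window size, feature choice, and classifier are all assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for MHEALTH: 200 fixed-length sensor windows,
# each 50 samples x 23 channels (accel, gyro, magnetometer, ECG).
rng = np.random.default_rng(0)
n_windows, window_len, n_channels = 200, 50, 23
raw = rng.normal(size=(n_windows, window_len, n_channels))

# Simple per-window features: mean and std of every channel.
features = np.hstack([raw.mean(axis=1), raw.std(axis=1)])

# Hypothetical activity labels, e.g. 0=walking, 1=cycling, 2=lying down.
labels = rng.integers(0, 3, size=n_windows)

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(features, labels)
preds = clf.predict(features)  # one activity label per window
print(preds.shape)
```

With real data, the windows would be sliced from the continuous MHEALTH recordings and the model evaluated on held-out subjects rather than the training windows.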
No commits in the last 6 months.
Use this if you need to classify specific human activities from multimodal wearable sensor data, especially for health monitoring or behavioral research.
Not ideal if you need to analyze activities beyond the pre-defined set, or if you only have a single sensor type rather than a multi-sensor setup.
Stars: 24
Forks: 4
Language: Jupyter Notebook
License: —
Category: ml-frameworks
Last pushed: Mar 10, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/G0rav/Human_Activity_Recognition"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
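The same endpoint can be called from Python instead of curl. A minimal sketch, assuming only the URL shown above (the response schema is not documented here, so the fetch is left commented out):

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, repo: str) -> str:
    """Build the API URL for a repo's quality record."""
    return f"{BASE}/{category}/{repo}"

url = quality_url("ml-frameworks", "G0rav/Human_Activity_Recognition")
print(url)

# No key needed for up to 100 requests/day; uncomment to fetch:
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
#     print(data)
```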
Higher-rated alternatives
OxWearables/stepcount
Improved Step Counting via Foundation Models for Wrist-Worn Accelerometers
OxWearables/actinet
An activity classification model based on self-supervised learning for wrist-worn accelerometer data.
aqibsaeed/Human-Activity-Recognition-using-CNN
Convolutional Neural Network for Human Activity Recognition in Tensorflow
felixchenfy/Realtime-Action-Recognition
Apply ML to the skeletons from OpenPose; 9 actions; multiple people. (WARNING: I'm sorry that...
guillaume-chevalier/LSTM-Human-Activity-Recognition
Human Activity Recognition example using TensorFlow on smartphone sensors dataset and an LSTM...