takumiw/Deep-Learning-for-Human-Activity-Recognition
Keras implementations of CNN, DeepConvLSTM, SDAE, and LightGBM for sensor-based Human Activity Recognition (HAR).
This project classifies human activities such as walking, standing, or sitting from smartphone sensor data. It takes raw accelerometer and gyroscope readings and outputs a prediction of the activity being performed, which is useful for researchers or developers building applications that need to understand user movement and behavior.
No commits in the last 6 months.
Use this if you need to identify specific human activities from smartphone sensor data and want to compare different deep learning and machine learning models for accuracy.
Not ideal if you are looking for real-time, production-ready solutions or if your activity recognition needs extend beyond the specific activities and sensor types covered.
Stars
74
Forks
17
Language
Jupyter Notebook
License
MIT
Last pushed
Feb 16, 2023
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/takumiw/Deep-Learning-for-Human-Activity-Recognition"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
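The same endpoint can be called from Python instead of curl. A minimal sketch is below; note that the response schema is an assumption here, so fields like `stars` and `forks` are illustrative rather than documented.

```python
import json
import urllib.request

# Endpoint shown on this card. The JSON field names used below
# ("stars", "forks") are assumptions for illustration only.
URL = ("https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/"
       "takumiw/Deep-Learning-for-Human-Activity-Recognition")

def fetch_quality(url: str = URL) -> dict:
    """Fetch the repo-quality record and decode the JSON body."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

def summarize(record: dict) -> str:
    """Render a one-line summary from whichever fields are present."""
    stars = record.get("stars", "?")
    forks = record.get("forks", "?")
    return f"stars={stars} forks={forks}"

if __name__ == "__main__":
    print(summarize(fetch_quality()))
```

Keep the no-key tier's 100 requests/day limit in mind if you poll this in a script.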
Compare
Higher-rated alternatives
OxWearables/stepcount
Improved Step Counting via Foundation Models for Wrist-Worn Accelerometers
OxWearables/actinet
An activity classification model based on self-supervised learning for wrist-worn accelerometer data.
guillaume-chevalier/LSTM-Human-Activity-Recognition
Human Activity Recognition example using TensorFlow on smartphone sensors dataset and an LSTM...
felixchenfy/Realtime-Action-Recognition
Apply ML to the skeletons from OpenPose; 9 actions; multiple people. (WARNING: I'm sorry that...
aqibsaeed/Human-Activity-Recognition-using-CNN
Convolutional Neural Network for Human Activity Recognition in Tensorflow