DiFronzo/LSTM-for-Human-Activity-Recognition-classification
Deep convolutional and LSTM feature extraction approach with 784 features.
This project classifies human movements such as walking, sitting, or standing from smartphone sensor data. It takes raw accelerometer and gyroscope readings and outputs a prediction of the specific activity being performed. This is useful for researchers, fitness trackers, or any application that needs to understand user activity.
Use this if you need to automatically categorize different human activities from smartphone sensor data, specifically among walking, walking upstairs, walking downstairs, sitting, and standing.
Not ideal if you need to recognize a wide range of complex activities beyond the listed basic movements or if your input data isn't from smartphone accelerometers and gyroscopes.
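Models like this one typically segment the raw accelerometer/gyroscope stream into fixed-length overlapping windows before feeding it to the CNN/LSTM. The sketch below uses a 128-sample window with 50% overlap, which follows the common UCI HAR convention; that window size and the `sliding_windows` helper are assumptions for illustration, not confirmed details of this repo.

```python
import numpy as np

def sliding_windows(signal: np.ndarray, window: int = 128, step: int = 64) -> np.ndarray:
    """Split a (timesteps, channels) sensor stream into overlapping windows.

    window=128 with 50% overlap (step=64) mirrors the common UCI HAR
    preprocessing convention -- an assumption about this repo's setup.
    Returns an array of shape (num_windows, window, channels).
    """
    n = (len(signal) - window) // step + 1
    return np.stack([signal[i * step : i * step + window] for i in range(n)])

# e.g. 10 s of 6-axis (accelerometer + gyroscope) data sampled at 50 Hz
raw = np.zeros((500, 6))
X = sliding_windows(raw)
print(X.shape)  # (6, 128, 6): 6 windows of 128 timesteps x 6 channels
```

Each resulting window is one training example, labeled with the activity performed during that interval.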
Stars: 8
Forks: 1
Language: Jupyter Notebook
License: MIT
Category:
Last pushed: Mar 12, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/DiFronzo/LSTM-for-Human-Activity-Recognition-classification"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
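The same endpoint can be queried from Python. This is a minimal sketch that assumes the service returns a JSON body; the response schema is not documented here, so the returned dict's fields are unknown:

```python
import json
from urllib.request import urlopen

# Endpoint taken verbatim from the listing above.
API_URL = (
    "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/"
    "DiFronzo/LSTM-for-Human-Activity-Recognition-classification"
)

def fetch_quality_record(url: str = API_URL) -> dict:
    """Fetch the repo's quality record (100 requests/day without an API key)."""
    with urlopen(url, timeout=10) as resp:
        return json.load(resp)  # assumes the endpoint returns JSON
```

With a free key (1,000 requests/day), you would presumably pass it in a header, but the listing does not specify the header name.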
Higher-rated alternatives
OxWearables/stepcount
Improved Step Counting via Foundation Models for Wrist-Worn Accelerometers
OxWearables/actinet
An activity classification model based on self-supervised learning for wrist-worn accelerometer data.
aqibsaeed/Human-Activity-Recognition-using-CNN
Convolutional Neural Network for Human Activity Recognition in Tensorflow
felixchenfy/Realtime-Action-Recognition
Apply ML to the skeletons from OpenPose; 9 actions; multiple people. (WARNING: I'm sorry that...
guillaume-chevalier/LSTM-Human-Activity-Recognition
Human Activity Recognition example using TensorFlow on smartphone sensors dataset and an LSTM...