DiFronzo/LSTM-for-Human-Activity-Recognition-classification

Deep convolutional and LSTM feature extraction approach with 784 features.

Score: 38 / 100 (Emerging)

This project classifies human movements such as walking, sitting, and standing from smartphone sensor data. It takes raw accelerometer and gyroscope readings and outputs a prediction of the specific activity being performed. This is useful for researchers, fitness trackers, and applications that need to understand user activity.

Use this if you need to automatically categorize different human activities from smartphone sensor data, specifically among walking, walking upstairs, walking downstairs, sitting, and standing.

Not ideal if you need to recognize a wide range of complex activities beyond the listed basic movements or if your input data isn't from smartphone accelerometers and gyroscopes.
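To feed raw accelerometer and gyroscope readings into an LSTM classifier, the continuous signal is typically segmented into fixed-length, overlapping windows first. A minimal sketch of that preprocessing step, using NumPy and assuming the common 6-channel layout (3-axis accelerometer + 3-axis gyroscope) and 128-timestep windows with 50% overlap; the exact window length and step used by this repository may differ:

```python
import numpy as np

# Hypothetical raw recording: 6 sensor channels (3-axis accelerometer +
# 3-axis gyroscope) sampled over 1,000 timesteps.
raw = np.random.randn(1000, 6)

def sliding_windows(signal, window=128, step=64):
    """Segment a (timesteps, channels) signal into fixed-length,
    50%-overlapping windows -- the shape an LSTM classifier expects:
    (num_windows, window, channels)."""
    starts = range(0, len(signal) - window + 1, step)
    return np.stack([signal[s:s + window] for s in starts])

X = sliding_windows(raw)
print(X.shape)  # (14, 128, 6): 14 windows of 128 timesteps x 6 channels
```

Each window then gets a single activity label, and the stacked array is what a recurrent model consumes batch by batch.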

Tags: Human Activity Recognition, Fitness Tracking, Behavioral Analysis, Sensor Data Analysis, Movement Classification
No Package · No Dependents
Maintenance 10 / 25
Adoption 4 / 25
Maturity 16 / 25
Community 8 / 25


Stars: 8
Forks: 1
Language: Jupyter Notebook
License: MIT
Last pushed: Mar 12, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/DiFronzo/LSTM-for-Human-Activity-Recognition-classification"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
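The same data can be fetched from Python with only the standard library. A minimal sketch: `fetch_quality` performs the live request (network required), and the offline example below it parses a hypothetical payload whose field names (`score`, `tier`) are assumptions, not a documented schema; inspect the live response for the real fields:

```python
import json
from urllib.request import urlopen

URL = ("https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/"
       "DiFronzo/LSTM-for-Human-Activity-Recognition-classification")

def fetch_quality(url=URL):
    """Fetch the quality record from the API; requires network access."""
    with urlopen(url) as resp:
        return json.loads(resp.read())

# Offline illustration with a hypothetical payload (field names are
# assumed for demonstration only):
sample = json.loads('{"score": 38, "tier": "Emerging"}')
print(sample["score"], sample["tier"])  # 38 Emerging
```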