takumiw/Deep-Learning-for-Human-Activity-Recognition

Keras implementations of CNN, DeepConvLSTM, and SDAE, plus LightGBM, for sensor-based Human Activity Recognition (HAR).

Score: 44 / 100 (Emerging)

This project helps classify human activities like walking, standing, or sitting using data from smartphone sensors. It takes raw accelerometer and gyroscope readings and outputs a prediction of the activity being performed. This is useful for researchers or developers creating applications that need to understand user movement and behavior.
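The typical first step in pipelines like this is segmenting the raw accelerometer/gyroscope stream into fixed-length overlapping windows before feeding them to a model. A minimal sketch of that preprocessing, assuming NumPy arrays of shape (timesteps, channels); the 128-sample window with 50% overlap mirrors common HAR benchmark setups, but the exact parameters this repo uses may differ:

```python
import numpy as np

def sliding_windows(signal, window_size=128, step=64):
    """Segment a (timesteps, channels) sensor signal into overlapping windows."""
    starts = range(0, len(signal) - window_size + 1, step)
    return np.stack([signal[s:s + window_size] for s in starts])

# Synthetic stand-in: 10 s of 3-axis accelerometer data at 50 Hz
acc = np.random.randn(500, 3)
windows = sliding_windows(acc)
print(windows.shape)  # (6, 128, 3)
```

Each resulting window (here 128 timesteps x 3 channels) is what a CNN or DeepConvLSTM classifies into an activity label.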

No commits in the last 6 months.

Use this if you need to identify specific human activities from smartphone sensor data and want to compare different deep learning and machine learning models for accuracy.

Not ideal if you are looking for real-time, production-ready solutions or if your activity recognition needs extend beyond the specific activities and sensor types covered.

activity-recognition wearable-tech sensor-data-analysis behavioral-analytics mobile-health
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 9 / 25
Maturity 16 / 25
Community 19 / 25


Stars: 74
Forks: 17
Language: Jupyter Notebook
License: MIT
Last pushed: Feb 16, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/takumiw/Deep-Learning-for-Human-Activity-Recognition"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
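For programmatic use from Python, the same endpoint can be fetched with the standard library. A minimal sketch; the field names in the sample payload (`score`, `tier`) are assumptions for illustration, so inspect the real JSON response before relying on them:

```python
import json
import urllib.request

URL = ("https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/"
       "takumiw/Deep-Learning-for-Human-Activity-Recognition")

def fetch_quality(url=URL, timeout=10):
    """Fetch and parse the quality JSON for a repository (network call)."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.load(resp)

# Offline illustration with a hypothetical payload -- field names assumed:
sample = json.loads('{"score": 44, "tier": "Emerging"}')
print(sample["score"], sample["tier"])  # 44 Emerging
```

Keys under the free tier are rate-limited as noted above, so cache responses rather than polling.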