LSTM-Human-Activity-Recognition and TensorFlow-on-Android-for-Human-Activity-Recognition-with-LSTMs

The second project, "TensorFlow-on-Android-for-Human-Activity-Recognition-with-LSTMs," complements the first, "LSTM-Human-Activity-Recognition": it provides an iPython notebook and an Android app that demonstrate deploying an LSTM model, similar to the one built in the first project, on an Android device.

LSTM-Human-Activity-Recognition
Maintenance 0/25
Adoption 10/25
Maturity 16/25
Community 25/25
Stars: 3,549
Forks: 938
Downloads:
Commits (30d): 0
Language: Jupyter Notebook
License: MIT
Stale 6m · No Package · No Dependents

TensorFlow-on-Android-for-Human-Activity-Recognition-with-LSTMs
Maintenance 0/25
Adoption 10/25
Maturity 8/25
Community 24/25
Stars: 196
Forks: 96
Downloads:
Commits (30d): 0
Language: Jupyter Notebook
License:
No License · Stale 6m · No Package · No Dependents

About LSTM-Human-Activity-Recognition

guillaume-chevalier/LSTM-Human-Activity-Recognition

Human Activity Recognition example using TensorFlow on smartphone sensors dataset and an LSTM RNN. Classifying the type of movement amongst six activity categories - Guillaume Chevalier

This project helps anyone working with sensor data from smartphones to automatically identify six common human activities: walking, walking upstairs, walking downstairs, sitting, standing, and laying. It takes raw accelerometer and gyroscope data as input and outputs a classification of the activity being performed. This is useful for researchers, product developers, or data analysts in fields like health, fitness, or behavioral science.
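The classification described above — a fixed-length window of smartphone sensor readings in, one of six activity labels out — can be sketched with a single LSTM cell followed by a softmax layer. The sketch below is illustrative only: it uses random, untrained weights, and the window size (128 timesteps) and channel count (9: body acceleration, total acceleration, and gyroscope, three axes each) are assumptions based on the UCI HAR dataset this project uses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed UCI HAR-style input: 128 timesteps x 9 sensor channels.
TIMESTEPS, CHANNELS, HIDDEN, CLASSES = 128, 9, 32, 6

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_forward(x, W, U, b):
    """Run one LSTM layer over a sensor window; return the final hidden state."""
    h = np.zeros(HIDDEN)
    c = np.zeros(HIDDEN)
    for t in range(x.shape[0]):
        z = W @ x[t] + U @ h + b                      # all four gates at once
        i = sigmoid(z[0 * HIDDEN:1 * HIDDEN])         # input gate
        f = sigmoid(z[1 * HIDDEN:2 * HIDDEN])         # forget gate
        o = sigmoid(z[2 * HIDDEN:3 * HIDDEN])         # output gate
        g = np.tanh(z[3 * HIDDEN:])                   # candidate cell values
        c = f * c + i * g                             # cell-state update
        h = o * np.tanh(c)                            # hidden state
    return h

# Random stand-in parameters; a real model learns these during training.
W = rng.normal(scale=0.1, size=(4 * HIDDEN, CHANNELS))
U = rng.normal(scale=0.1, size=(4 * HIDDEN, HIDDEN))
b = np.zeros(4 * HIDDEN)
W_out = rng.normal(scale=0.1, size=(CLASSES, HIDDEN))

window = rng.normal(size=(TIMESTEPS, CHANNELS))       # one fake sensor window
logits = W_out @ lstm_forward(window, W, U, b)
probs = np.exp(logits) / np.exp(logits).sum()         # softmax over 6 activities
```

With trained weights, `probs` would assign high probability to the activity being performed (walking, sitting, etc.); here it merely shows the shape of the computation.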

activity-recognition mobile-health behavioral-analytics sensor-data-analysis wearable-tech

About TensorFlow-on-Android-for-Human-Activity-Recognition-with-LSTMs

curiousily/TensorFlow-on-Android-for-Human-Activity-Recognition-with-LSTMs

An iPython notebook and an Android app that show how to build an LSTM model in TensorFlow and deploy it on Android.

This project helps you classify human activities using sensor data from a smartphone. You feed in raw accelerometer and gyroscope readings, and it tells you what activity a person is performing, such as walking or standing. This is useful for mobile app developers or researchers creating fitness trackers, health monitoring apps, or interactive sports applications.
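Before "raw accelerometer and gyroscope readings" can be fed to such a model, the continuous sensor stream is typically segmented into fixed-length, overlapping windows. A minimal sketch of that preprocessing step follows; the window width (128 samples), 50% overlap, and 6-channel layout (3 accelerometer + 3 gyroscope axes) are assumptions, not this project's exact settings.

```python
import numpy as np

def make_windows(stream, width=128, stride=64):
    """Segment a (n_samples, n_channels) sensor stream into fixed-length,
    50%-overlapping windows ready for an activity-recognition model."""
    n = (stream.shape[0] - width) // stride + 1
    return np.stack([stream[i * stride : i * stride + width] for i in range(n)])

# Fake stream: 640 samples of 6 channels (accelerometer x/y/z, gyroscope x/y/z).
stream = np.arange(640 * 6, dtype=float).reshape(640, 6)
windows = make_windows(stream)
# windows.shape -> (9, 128, 6): nine overlapping windows for the classifier
```

Overlapping windows smooth predictions at activity boundaries, since each transition is seen by several consecutive windows.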

mobile-app-development fitness-tracking health-monitoring activity-recognition sensor-data-analysis

Scores updated daily from GitHub, PyPI, and npm data.