LSTM-Human-Activity-Recognition and TensorFlow-on-Android-for-Human-Activity-Recognition-with-LSTMs
The second project, "TensorFlow-on-Android-for-Human-Activity-Recognition-with-LSTMs," complements the first, "LSTM-Human-Activity-Recognition": it provides an IPython notebook and an Android app that demonstrate deploying an LSTM model, similar to the one built in the first project, onto an Android device.
About LSTM-Human-Activity-Recognition
guillaume-chevalier/LSTM-Human-Activity-Recognition
Human Activity Recognition example using TensorFlow on smartphone sensors dataset and an LSTM RNN. Classifying the type of movement amongst six activity categories - Guillaume Chevalier
This project helps anyone working with sensor data from smartphones to automatically identify six common human activities: walking, walking upstairs, walking downstairs, sitting, standing, and laying. It takes raw accelerometer and gyroscope data as input and outputs a classification of the activity being performed. This is useful for researchers, product developers, or data analysts in fields like health, fitness, or behavioral science.
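To illustrate the shape of that pipeline, here is a minimal NumPy sketch of LSTM inference over one window of sensor readings, ending in a softmax over the six activity classes. This is not the repository's actual code: the weights are random placeholders, and the shapes (128 timesteps of 9 channels, 32 hidden units) are assumptions loosely modeled on the UCI smartphone dataset the project uses.

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM step: x is the sensor frame at time t, (h, c) the previous state."""
    z = W @ x + U @ h + b                    # stacked gate pre-activations
    i, f, o, g = np.split(z, 4)              # input, forget, output gates + candidate
    i, f, o = (1.0 / (1.0 + np.exp(-v)) for v in (i, f, o))  # sigmoid gates
    c = f * c + i * np.tanh(g)               # new cell state
    h = o * np.tanh(c)                       # new hidden state
    return h, c

def classify_window(window, params, n_classes=6):
    """Run an LSTM over a window of sensor frames, then softmax over activities."""
    W, U, b, Wy, by = params
    hidden = W.shape[0] // 4
    h = np.zeros(hidden)
    c = np.zeros(hidden)
    for x in window:                         # iterate over timesteps
        h, c = lstm_step(x, h, c, W, U, b)
    logits = Wy @ h + by                     # final hidden state -> class scores
    e = np.exp(logits - logits.max())        # numerically stable softmax
    return e / e.sum()

# Hypothetical shapes: 128 timesteps x 9 channels, 32 hidden units, 6 classes.
rng = np.random.default_rng(0)
n_in, hidden, n_classes = 9, 32, 6
params = (rng.normal(0, 0.1, (4 * hidden, n_in)),
          rng.normal(0, 0.1, (4 * hidden, hidden)),
          np.zeros(4 * hidden),
          rng.normal(0, 0.1, (n_classes, hidden)),
          np.zeros(n_classes))
window = rng.normal(size=(128, n_in))
probs = classify_window(window, params)      # probabilities over the 6 activities
```

In the real project the weights come from training on labeled windows rather than a random generator, but the inference flow (window in, six-way probability vector out) is the same idea.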
About TensorFlow-on-Android-for-Human-Activity-Recognition-with-LSTMs
curiousily/TensorFlow-on-Android-for-Human-Activity-Recognition-with-LSTMs
IPython notebook and Android app that show how to build an LSTM model in TensorFlow and deploy it on Android
This project helps you classify human activities using sensor data from a smartphone. You feed in raw accelerometer and gyroscope readings, and it tells you what activity a person is performing, such as walking or standing. This is useful for mobile app developers or researchers creating fitness trackers, health monitoring apps, or interactive sports applications.