iantangc/SelfHAR

Improving Human Activity Recognition through Self-training with Unlabeled Data

Quality score: 42 / 100 (Emerging)

This project helps researchers and practitioners in human activity recognition (HAR) to more accurately classify human activities from mobile sensor data. It takes both a small amount of labeled sensor data and a large amount of unlabeled sensor data as input, and outputs a more robust activity recognition model. This is for anyone building or improving systems that automatically detect activities like walking, running, or sitting from device data.
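The labeled-plus-unlabeled training recipe above follows the classic self-training (teacher-student pseudo-labeling) pattern. The sketch below illustrates that pattern only; the nearest-centroid classifier, the toy data, and the confidence threshold are hypothetical stand-ins for the project's actual deep model and sensor pipeline.

```python
import numpy as np

# Illustration of the self-training idea: train a teacher on the small
# labeled set, pseudo-label the unlabeled pool with its confident
# predictions, then train a student on both. The nearest-centroid
# "model" is a stand-in chosen only to keep the sketch self-contained.

rng = np.random.default_rng(0)

# Toy sensor features: two activity clusters ("walking" = 0, "sitting" = 1).
labeled_X = np.vstack([rng.normal(0, 0.5, (5, 3)), rng.normal(3, 0.5, (5, 3))])
labeled_y = np.array([0] * 5 + [1] * 5)
unlabeled_X = np.vstack([rng.normal(0, 0.5, (50, 3)),
                         rng.normal(3, 0.5, (50, 3))])

def fit_centroids(X, y):
    # "Training" = one centroid per class.
    return np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(centroids, X):
    # Distance of each sample to each centroid; nearest centroid wins.
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1), d.min(axis=1)

# 1. Teacher trained on the scarce labeled data.
teacher = fit_centroids(labeled_X, labeled_y)

# 2. Pseudo-label the unlabeled pool, keeping only confident predictions
#    (here: samples close to a centroid; the 1.5 threshold is arbitrary).
pseudo_y, dist = predict(teacher, unlabeled_X)
confident = dist < 1.5
X_all = np.vstack([labeled_X, unlabeled_X[confident]])
y_all = np.concatenate([labeled_y, pseudo_y[confident]])

# 3. Student trained on labeled + pseudo-labeled data.
student = fit_centroids(X_all, y_all)
preds, _ = predict(student, unlabeled_X)
print(f"pseudo-labeled {confident.sum()} of {len(unlabeled_X)} samples")
```

The confidence filter is the key design choice: accepting every pseudo-label would let teacher mistakes propagate into the student, while a strict threshold keeps only the unlabeled samples the teacher is most sure about.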

No commits in the last 6 months.

Use this if you need a highly accurate human activity recognition model but have only a small labeled dataset (labeling sensor data is expensive), alongside a wealth of readily available unlabeled sensor data.

Not ideal if you have ample labeled data for your specific activities and use case, or if you are not working with mobile sensor data for activity recognition.

human-activity-recognition mobile-sensing wearable-technology health-monitoring sports-analytics
Stale (6 months) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 8 / 25
Maturity: 16 / 25
Community: 18 / 25


Stars: 42
Forks: 15
Language: Python
License: GPL-3.0
Last pushed: Jul 13, 2021
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/iantangc/SelfHAR"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.