CV-ZMH/human-action-recognition
Multi Person Skeleton Based Action Recognition and Tracking
This project helps security and surveillance professionals automatically identify and track human actions in real-time video feeds. It takes live video or recorded footage as input and outputs a labeled stream indicating actions like 'walk,' 'run,' 'jump,' or 'fight' for multiple people simultaneously. Law enforcement, retail security, or event organizers could use this to monitor behavior.
165 stars. No commits in the last 6 months.
Use this if you need to automatically detect and classify common human actions within video, especially in scenarios with multiple people.
Not ideal if you require recognition of highly specialized or nuanced actions, or if your primary goal is facial recognition.
Stars
165
Forks
32
Language
Python
License
MIT
Category
Computer Vision
Last pushed
Jan 26, 2022
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/CV-ZMH/human-action-recognition"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
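The same data can also be fetched programmatically. Below is a minimal Python sketch using only the standard library; the URL pattern comes from the curl example above, while the assumption that the endpoint returns a JSON object is ours and not confirmed here:

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, repo: str) -> str:
    # Build the endpoint URL following the pattern in the curl example above.
    return f"{BASE}/{category}/{repo}"

def fetch_quality(category: str, repo: str) -> dict:
    # Assumes the endpoint returns JSON (an assumption, not documented here).
    with urllib.request.urlopen(quality_url(category, repo), timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8"))

url = quality_url("computer-vision", "CV-ZMH/human-action-recognition")
print(url)
# To actually fetch (requires network access and counts against the rate limit):
# data = fetch_quality("computer-vision", "CV-ZMH/human-action-recognition")
```

Unkeyed requests count against the 100/day limit, so cache responses locally if you poll more than one repository.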
Higher-rated alternatives
quic/sense
Enhance your application with the ability to see and interact with humans using any RGB camera.
AlexanderMelde/SPHAR-Dataset
Surveillance Perspective Human Action Recognition Dataset: 7759 Videos from 14 Action Classes,...
Event-AHU/HARDVS
[AAAI-2024] HARDVS: Revisiting Human Activity Recognition with Dynamic Vision Sensors
mmact19/2019
MMAct: A Large-Scale Dataset for Cross Modal Learning on Human Action Understanding
yujmo/CZU_MHAD
CZU-MHAD: A multimodal dataset for human action recognition utilizing a depth camera and 10...