stevenlsw/hoi-forecast

[CVPR 2022] Joint hand motion and interaction hotspots prediction from egocentric videos

Score: 38 / 100 (Emerging)

This project helps researchers and developers working with egocentric video analysis predict future human actions. Given a sequence of past video frames captured from a first-person perspective, it outputs predictions for where a person's hands will move and which objects they are likely to interact with, highlighting potential 'hotspots' of interaction. It's designed for computer vision scientists and AI engineers focused on understanding human-object interaction.

No commits in the last 6 months.

Use this if you need to automatically generate training data for hand motion and object interaction prediction from egocentric video datasets like Epic-Kitchens, or if you want to evaluate existing models for these tasks.

Not ideal if you are looking for a plug-and-play solution for real-time human activity recognition without prior experience in machine learning model training or data preparation for computer vision tasks.

egocentric-video-analysis human-computer-interaction-prediction activity-forecasting computer-vision-research object-interaction-hotspot-detection
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 9 / 25
Maturity 16 / 25
Community 13 / 25


Stars: 71
Forks: 9
Language: Python
License: MIT
Last pushed: Jan 29, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/stevenlsw/hoi-forecast"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
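The same endpoint can be called from any HTTP client. A minimal Python sketch using only the standard library; the `category/owner/repo` path layout is inferred from the URL shown above, and the JSON response format is an assumption, not documented behavior:

```python
import json
import urllib.request

# Base URL taken from the curl example on this page
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-data URL for a repository.

    Path segments (category/owner/repo) are inferred from the
    example URL; this layout is an assumption.
    """
    return f"{API_BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch quality data for a repo; assumes a JSON response."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.loads(resp.read().decode())


# The repository on this page:
url = quality_url("computer-vision", "stevenlsw", "hoi-forecast")
```

Unauthenticated calls are limited to 100 requests/day, so cache responses if you poll many repositories.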