masashi-hatano/EgoH4

Official code release for "The Invisible EgoHand: 3D Hand Forecasting through EgoBody Pose Estimation"

Score: 31 / 100 (Emerging)

This project helps researchers in computer vision and robotics analyze human motion by forecasting future 3D hand poses from an egocentric view of the body. It takes video or sensor data of a person's body movements and outputs future 3D hand coordinates. It is designed for academics and engineers building applications such as human-computer interaction, virtual reality, and action recognition.

No commits in the last 6 months.

Use this if you need to anticipate hand movements in 3D space based on observed body posture, particularly from a first-person perspective.

Not ideal if you're looking for a tool to track hands in real-time from an external viewpoint or for general-purpose object detection.

Tags: human-computer interaction, robotics, computer vision, motion prediction, virtual reality
Badges: Stale (6m), No Package, No Dependents
Maintenance 2 / 25
Adoption 7 / 25
Maturity 16 / 25
Community 6 / 25
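The overall score appears to be the sum of the four category scores, each out of 25 (an assumption inferred from the numbers shown, not documented by the site):

```python
# Category scores as listed above (each out of 25).
scores = {"Maintenance": 2, "Adoption": 7, "Maturity": 16, "Community": 6}

# Assumption: the overall score (out of 100) is the plain sum of the four.
total = sum(scores.values())
print(total)  # → 31, matching the "31 / 100" shown above
```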

Stars: 30
Forks: 2
Language: Python
License: MIT
Last pushed: Aug 19, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/diffusion/masashi-hatano/EgoH4"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
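A minimal sketch of calling the same endpoint from Python with only the standard library, following the URL pattern in the curl example. It assumes the endpoint returns JSON; the helper names `quality_url` and `fetch_quality` are hypothetical, not part of the API:

```python
import json
import urllib.request

def quality_url(owner: str, repo: str) -> str:
    # Build the API URL for any owner/repo pair,
    # following the pattern in the curl example above.
    return f"https://pt-edge.onrender.com/api/v1/quality/diffusion/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    # Fetch the quality record; assumes the endpoint returns JSON.
    # No API key is needed for up to 100 requests/day.
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

# Example usage (performs a real network request):
#   data = fetch_quality("masashi-hatano", "EgoH4")
#   print(json.dumps(data, indent=2))
```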