eth-siplab/EgoPoser

Official Code for ECCV 2024 paper "EgoPoser: Robust Real-Time Egocentric Pose Estimation from Sparse and Intermittent Observations Everywhere"

Score: 20 / 100 (Experimental)

This project helps VR/AR developers create more realistic virtual avatars by estimating full-body movement from limited head and hand tracking data, even when hands are only intermittently visible. It takes headset-based hand and head tracking inputs and produces a natural, full-body pose for a virtual avatar. This is ideal for developers building immersive experiences, games, or social platforms in virtual or augmented reality.

No commits in the last 6 months.

Use this if you need to generate believable full-body avatar motion for VR/AR applications using only head and hand tracking, especially when hand tracking may be imperfect or intermittent.

Not ideal if you require extremely precise, high-fidelity body tracking for motion capture in a studio environment with dedicated full-body sensors.

virtual-reality augmented-reality avatar-animation human-computer-interaction game-development
No License · Stale (6m) · No Package · No Dependents
Maintenance: 2 / 25
Adoption: 7 / 25
Maturity: 8 / 25
Community: 3 / 25


Stars: 40
Forks: 1
Language: Python
License: None
Last pushed: Aug 28, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/eth-siplab/EgoPoser"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.