eth-siplab/EgoPoser
Official code for the ECCV 2024 paper "EgoPoser: Robust Real-Time Egocentric Pose Estimation from Sparse and Intermittent Observations Everywhere"
This project helps VR/AR developers create more realistic virtual avatars by estimating full-body motion from sparse head and hand tracking, even when the hands are only intermittently visible to the headset. Given headset-based head and hand poses, it produces a natural full-body pose for a virtual avatar, which suits developers building immersive experiences, games, or social platforms in virtual or augmented reality.
No commits in the last 6 months.
Use this if you need to generate believable full-body avatar motion for VR/AR applications using only head and hand tracking, especially when hand tracking is imperfect or intermittent.
Not ideal if you require extremely precise, high-fidelity body tracking for motion capture in a studio environment with dedicated full-body sensors.
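To make the input/output contract concrete, here is a minimal Python sketch of the data flow described above. The function name estimate_pose, the 6-DoF input encoding, and the 22-joint output are illustrative assumptions, not the repository's actual interface:

```python
from typing import Optional
import numpy as np

# Hypothetical sketch of the tracking-to-avatar data flow; the names,
# shapes, and joint count are assumptions, not EgoPoser's real API.
def estimate_pose(head: np.ndarray,
                  left_hand: Optional[np.ndarray],
                  right_hand: Optional[np.ndarray]) -> np.ndarray:
    """Map sparse 6-DoF tracker signals to a full-body pose.

    Hand inputs may be None when a hand leaves the headset's field of
    view, mirroring the intermittent observations the paper targets.
    """
    # Placeholder output: per-joint rotation parameters for a body model.
    return np.zeros((22, 3))

head = np.zeros(12)        # 3-D position + flattened 3x3 rotation
left_hand = None           # hand currently not tracked
right_hand = np.zeros(12)
body_pose = estimate_pose(head, left_hand, right_hand)
```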
Stars: 40
Forks: 1
Language: Python
License: —
Category: computer-vision
Last pushed: Aug 28, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/eth-siplab/EgoPoser"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
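For programmatic use, a minimal Python sketch of the same call using only the standard library. The endpoint comes from the curl example above; since no response schema is documented here, the code just prints whatever JSON comes back rather than assuming field names:

```python
import json
import urllib.request

# Endpoint from the curl example above; no API key needed at the
# free tier (100 requests/day).
URL = ("https://pt-edge.onrender.com/api/v1/quality/"
       "computer-vision/eth-siplab/EgoPoser")

with urllib.request.urlopen(URL, timeout=10) as resp:
    data = json.load(resp)

# Inspect the returned record instead of assuming a schema.
print(json.dumps(data, indent=2))
```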
Higher-rated alternatives
DeepLabCut/DeepLabCut
Official implementation of DeepLabCut: Markerless pose estimation of user-defined features with...
openpifpaf/openpifpaf
Official implementation of "OpenPifPaf: Composite Fields for Semantic Keypoint Detection and...
lambdaloop/anipose
🐜🐀🐒🚶 A toolkit for robust markerless 3D pose estimation
DIYer22/bpycv
Computer vision utils for Blender (generate instance annotation, depth and 6D pose with one line of code)
NeLy-EPFL/DeepFly3D
Motion capture (markerless 3D pose estimation) pipeline and helper GUI for tethered Drosophila.