eth-siplab/AvatarPoser
Official Code for ECCV 2022 paper "AvatarPoser: Articulated Full-Body Pose Tracking from Sparse Motion Sensing"
This project creates realistic full-body avatars for virtual and augmented reality using only head and hand movements. It takes sparse motion data from standard VR/AR headsets as input and outputs a complete, natural full-body pose for a virtual avatar. It is aimed at game developers, metaverse content creators, and researchers building interactive virtual experiences.
324 stars. No commits in the last 6 months.
Use this if you need to animate full-body avatars in real-time within VR/AR applications without requiring additional body trackers.
Not ideal if you require extremely high-fidelity, biomechanically exact motion capture for specialized applications like medical analysis or professional animation.
Stars: 324
Forks: 56
Language: Python
License: MIT
Category: Computer Vision
Last pushed: Feb 20, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/eth-siplab/AvatarPoser"
Open to everyone: 100 requests/day with no key needed, or get a free key for 1,000/day.
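For programmatic access, the curl call above can be wrapped in a small Python client. This is a minimal sketch using only the standard library; the URL path layout (`/quality/<category>/<owner>/<repo>`) is taken from the example above, but the function names and the shape of the JSON response are assumptions, since the response schema isn't documented here.

```python
import json
import urllib.request

# Base endpoint taken from the curl example above (assumed stable).
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-data URL for a repo, following the
    /quality/<category>/<owner>/<repo> path layout shown above."""
    return f"{API_BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str, timeout: float = 10.0) -> dict:
    """Fetch and parse the quality record as JSON.

    Anonymous access is rate-limited to 100 requests/day, so callers
    should cache results rather than polling.
    """
    with urllib.request.urlopen(quality_url(category, owner, repo),
                                timeout=timeout) as resp:
        return json.load(resp)
```

Example usage: `fetch_quality("computer-vision", "eth-siplab", "AvatarPoser")` would return the same record as the curl command, parsed into a dict.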
Higher-rated alternatives
DeepLabCut/DeepLabCut
Official implementation of DeepLabCut: Markerless pose estimation of user-defined features with...
openpifpaf/openpifpaf
Official implementation of "OpenPifPaf: Composite Fields for Semantic Keypoint Detection and...
lambdaloop/anipose
🐜🐀🐒🚶 A toolkit for robust markerless 3D pose estimation
DIYer22/bpycv
Computer vision utils for Blender (generate instance annotation, depth and 6D pose with one line of code)
NeLy-EPFL/DeepFly3D
Motion capture (markerless 3D pose estimation) pipeline and helper GUI for tethered Drosophila.