maceq687/FullBodyPoseEstimation
Full-body pose estimation for use with an HMD (Quest2), built in Unity
This project helps VR content creators and animators bring full-body avatars to life in virtual reality environments using affordable equipment. It takes input from a Quest2 HMD, its controllers, and either a single webcam, multiple webcams, or a Kinect v2 sensor. The output is a fully controllable avatar within Unity that mirrors the user's head, hand, and full-body movements.
No commits in the last 6 months.
Use this if you need to integrate realistic full-body motion tracking into a Unity VR application or game, using readily available cameras and an HMD.
Not ideal if you require extremely high-fidelity, professional-grade motion capture without any VR integration, or if you prefer a system that doesn't rely on webcams or Kinect.
Stars: 91
Forks: 9
Language: C#
License: GPL-3.0
Category:
Last pushed: Dec 04, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/maceq687/FullBodyPoseEstimation"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
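For programmatic access, the curl command above can be reproduced in Python. This is a minimal sketch using only the standard library; the URL pattern (`/quality/<category>/<owner>/<repo>`) is taken from the example above, but the shape of the JSON response is an assumption, since the schema is not documented here.

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def build_url(category: str, owner: str, repo: str) -> str:
    """Assemble the endpoint URL following the pattern in the curl example."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch the endpoint and decode the JSON body.

    Note: the response fields are not documented on this page, so callers
    should inspect the returned dict rather than assume specific keys.
    """
    with urllib.request.urlopen(build_url(category, owner, repo)) as resp:
        return json.load(resp)

# Build the URL for this repository (same endpoint as the curl example).
url = build_url("computer-vision", "maceq687", "FullBodyPoseEstimation")
print(url)
```

Within the free tier (100 requests/day without a key), `fetch_quality("computer-vision", "maceq687", "FullBodyPoseEstimation")` would return the decoded response as a dict.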
Higher-rated alternatives
stereolabs/zed-unity - ZED SDK Unity plugin
CMU-Perceptual-Computing-Lab/openpose_unity_plugin - OpenPose's Unity Plugin for Unity users
Unity-Technologies/com.unity.perception - Perception toolkit for sim2real training and validation in Unity
evo-biomech/replicAnt - replicAnt - generating annotated images of animals in complex environments with Unreal Engine
Unity-Technologies/SynthDet - SynthDet - An end-to-end object detection pipeline using synthetic data