zxz267/AvatarJLM

[ICCV 2023] Realistic Full-Body Tracking from Sparse Observations via Joint-Level Modeling

Score: 34 / 100 (Emerging)

This project helps animators, game developers, and virtual-reality content creators produce realistic full-body movements for 3D characters, even when only limited data from head and hand trackers is available. It takes sparse tracking signals from the head and hands and outputs accurate, smooth, and believable full-body motion data that can be applied to 3D avatars. The primary users are professionals who need to create natural character animations efficiently.

No commits in the last 6 months.

Use this if you need to generate high-quality, realistic full-body animations for 3D characters using only input from head and hand tracking devices.

Not ideal if you already have dense full-body motion-capture data, or if your application requires real-time, ultra-low-latency tracking directly from live camera feeds without an intermediate processing step.

3D-animation virtual-reality-development motion-capture character-rigging game-design
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 8 / 25
Maturity 16 / 25
Community 10 / 25


Stars: 52
Forks: 5
Language: Python
License: MIT
Last pushed: Feb 29, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/zxz267/AvatarJLM"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
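The same request can be made from Python using only the standard library. This is a minimal sketch: the endpoint URL is taken from the curl example above, but the JSON field names in the response are not documented here, so the code only builds the URL and decodes whatever JSON comes back.

```python
import json
import urllib.request

# Base endpoint taken from the curl example on this page.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-score endpoint URL for a repository."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON score card (requires network access;
    subject to the 100 requests/day limit without a key)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

# URL for the repository shown on this page.
url = quality_url("computer-vision", "zxz267", "AvatarJLM")
```

Calling `fetch_quality("computer-vision", "zxz267", "AvatarJLM")` would then return the parsed score data as a dictionary; its exact keys depend on the API's response schema.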