pjyazdian/Gesture2Vec

This is an official PyTorch implementation of "Gesture2Vec: Clustering Gestures using Representation Learning Methods for Co-speech Gesture Generation" (IROS 2022).

Quality score: 36 / 100 (Emerging)

This project helps researchers and animators automatically generate realistic co-speech gestures for virtual characters or digital avatars. You provide text, and it outputs a sequence of discrete, human-like gesture chunks that enhance the naturalness and diversity of character animation. It is ideal for those working in animation, virtual reality, or human-computer interaction.

No commits in the last 6 months.

Use this if you need to create convincing and diverse hand gestures that naturally accompany spoken language for animated characters or virtual agents.

Not ideal if you are looking for a tool to analyze existing human gesture data without generating new gestures, or if you require non-speech-related body movements.

Tags: character-animation, virtual-reality, human-computer-interaction, digital-avatars, gesture-synthesis
Badges: Stale (6m), No Package, No Dependents

Score breakdown:
Maintenance: 0 / 25
Adoption: 7 / 25
Maturity: 16 / 25
Community: 13 / 25


Stars: 26
Forks: 4
Language: Python
License: MIT
Last pushed: Feb 09, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/pjyazdian/Gesture2Vec"

Open to everyone: 100 requests/day with no API key; a free key raises the limit to 1,000 requests/day.
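The same endpoint can also be called from a script. A minimal Python sketch using only the standard library is below; the response schema is not documented here, so the code simply parses and prints whatever JSON the API returns rather than assuming specific fields:

```python
import json
import urllib.request

# Quality-report endpoint shown in the curl example above.
# No API key is needed for up to 100 requests/day.
URL = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/pjyazdian/Gesture2Vec"


def fetch_quality(url: str = URL, timeout: float = 10.0) -> dict:
    """Fetch the quality report for the repo and parse it as JSON."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.loads(resp.read().decode("utf-8"))


if __name__ == "__main__":
    # Pretty-print the raw report; field names depend on the API's
    # actual response format.
    print(json.dumps(fetch_quality(), indent=2))
```

With a free key, the same request can be authenticated however the service specifies (for example, a header or query parameter); the exact mechanism is not stated in the card above.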