liangxuy/ReGenNet
[CVPR 2024] Official implementation of the paper "ReGenNet: Towards Human Action-Reaction Synthesis"
This project generates realistic animations of two interacting people, where one person reacts to the other's actions. You input a pose sequence (e.g., from motion capture) for an 'actor', and it synthesizes a corresponding pose sequence for a 'reactor'. This is useful for animators, game developers, and researchers who need to simulate natural human social interactions.
No commits in the last 6 months.
Use this if you need to automatically generate how one virtual human character would realistically react to the specific movements of another.
Not ideal if you're looking for a tool to generate single-person movements or highly stylized, non-realistic human animations.
Stars: 68
Forks: 5
Language: Python
License: MIT
Category:
Last pushed: Sep 23, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/diffusion/liangxuy/ReGenNet"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
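The curl example above can also be called programmatically. A minimal Python sketch, assuming only the endpoint layout shown in the curl command (the JSON response schema is not documented here, so the fetch helper returns the raw decoded payload):

```python
import json
from urllib.request import urlopen

# Base path taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/diffusion"

def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository endpoint URL."""
    return f"{API_BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload; field names are an unknown here."""
    with urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

print(quality_url("liangxuy", "ReGenNet"))
```

Without an API key this is limited to 100 requests/day, so cache responses if you poll many repositories.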
Higher-rated alternatives
hao-ai-lab/FastVideo
A unified inference and post-training framework for accelerated video generation.
ModelTC/LightX2V
Light Image Video Generation Inference Framework
thu-ml/TurboDiffusion
TurboDiffusion: 100–200× Acceleration for Video Diffusion Models
PKU-YuanGroup/Helios
Helios: Real-Time Long Video Generation Model
PKU-YuanGroup/MagicTime
[TPAMI 2025🔥] MagicTime: Time-lapse Video Generation Models as Metamorphic Simulators