menyifang/MIMO

Official implementation of "MIMO: Controllable Character Video Synthesis with Spatial Decomposed Modeling"

Score: 43 / 100 (Emerging)

This tool helps animators and content creators generate realistic character videos from simple inputs. You provide an image of a character, along with motion data (like a 3D pose or another video), and it outputs a new video of your character performing that motion within a scene. This is ideal for professionals in animation, game development, or digital content creation who need to quickly prototype character movements.

1,575 stars. No commits in the last 6 months.

Use this if you need to animate a static character image with custom movements or integrate a character into a dynamic scene, without extensive manual rigging or frame-by-frame animation.

Not ideal if you need to generate highly precise, physics-based simulations or require pixel-level control over every aspect of a character's interaction with complex environments.

character-animation video-synthesis motion-graphics digital-human-creation content-production
Status: Stale (6 months) · No Package · No Dependents
Maintenance 2 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 15 / 25


Stars: 1,575
Forks: 70
Language: Python
License: Apache-2.0
Last pushed: Jun 19, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/diffusion/menyifang/MIMO"

Open to everyone: 100 requests/day with no API key. Get a free key for 1,000 requests/day.
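The same request can be made from Python. A minimal sketch using only the standard library; the path segments (ecosystem/owner/repo) are inferred from the example URL above, and the shape of the JSON response is an assumption, not documented here:

```python
import json
import urllib.request

# Base URL taken from the curl example in the listing.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(ecosystem: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a repository."""
    return f"{API_BASE}/{ecosystem}/{owner}/{repo}"

def fetch_quality(ecosystem: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON report (counts against the daily quota)."""
    with urllib.request.urlopen(quality_url(ecosystem, owner, repo)) as resp:
        return json.load(resp)

print(quality_url("diffusion", "menyifang", "MIMO"))
# https://pt-edge.onrender.com/api/v1/quality/diffusion/menyifang/MIMO
```

Building the URL separately from the fetch keeps the network call (and the daily quota) out of any code path that only needs the endpoint string.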