AliaksandrSiarohin/monkey-net
Animating Arbitrary Objects via Deep Motion Transfer
This project lets animators and content creators bring static images to life by transferring motion from a driving video: given a still image of an object and a video showing the desired motion, it generates a new video in which your object moves like the one in the driving video. It is well suited to artists, marketers, and anyone producing visual content who needs to animate a specific image without complex manual animation.
477 stars. No commits in the last 6 months.
Use this if you need to quickly animate a still image by applying movement patterns from an existing video, such as making a photo of a person talk like someone in a video, or a drawing of an animal move like a real one.
Not ideal if you need fine-grained control over individual movements or want to generate completely novel motions not present in a driving video, as this focuses on transferring existing motion.
Stars: 477
Forks: 78
Language: Python
License: —
Category: —
Last pushed: Nov 22, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/AliaksandrSiarohin/monkey-net"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
pypose/pypose
A library for differentiable robotics on manifolds.
MarcoForte/FBA_Matting
Official repository for the paper F, B, Alpha Matting
snap-research/articulated-animation
Code for Motion Representations for Articulated Animation paper
foamliu/Deep-Image-Matting
Deep Image Matting
dyelax/Adversarial_Video_Generation
A TensorFlow Implementation of "Deep Multi-Scale Video Prediction Beyond Mean Square Error" by...