Blade6570/icface
ICface: Interpretable and Controllable Face Reenactment Using GANs
This tool helps animators, content creators, or visual effects artists transfer facial expressions and head movements from one person to another in a video. You provide a static image of a person's face (the source) and a video of another person performing expressions (the driver). The output is a new video where the source face realistically animates with the driver's expressions and movements.
164 stars. No commits in the last 6 months.
Use this if you need to realistically reenact or animate a specific face using the expressions and head movements from a different video performance.
Not ideal if you need to generate entirely new facial expressions or perform complex 3D manipulations of a face beyond reenactment.
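To make the source-image-plus-driver-video workflow concrete, here is a minimal sketch of how such a tool is typically invoked from a wrapper. The script name and flag names below are hypothetical, not ICface's actual interface; check the repository's README for the real entry point and arguments.

```python
import subprocess

def build_reenactment_cmd(source_image: str, driver_video: str, output_path: str) -> list:
    """Assemble a command line for a hypothetical reenactment script.

    The entry point ("test.py") and flags are illustrative only;
    ICface's actual test script is documented in its README.
    """
    return [
        "python", "test.py",       # hypothetical entry point
        "--source", source_image,  # static face image to animate
        "--driver", driver_video,  # video providing expressions and head pose
        "--output", output_path,   # where the reenacted video is written
    ]

cmd = build_reenactment_cmd("source.png", "driver.mp4", "out.mp4")
print(" ".join(cmd))
# To actually run it (inside the repo, with its dependencies installed):
# subprocess.run(cmd, check=True)
```

The wrapper only builds the argument list; keeping command construction separate from execution makes the call easy to log or test before spawning a process.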
Stars: 164
Forks: 26
Language: Python
License: —
Category: —
Last pushed: Jul 10, 2020
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/diffusion/Blade6570/icface"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
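The same endpoint can be called from Python with only the standard library. A minimal sketch follows; the response's JSON field names are not documented on this page, so the function just parses the body generically rather than assuming a schema.

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/diffusion"

def fetch_repo_quality(owner: str, repo: str, timeout: float = 10.0) -> dict:
    """Fetch quality data for a repository from the public endpoint.

    No API key is needed for up to 100 requests/day; the response is
    assumed to be a JSON object (field names are not specified here).
    """
    url = f"{API_BASE}/{owner}/{repo}"
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Example (performs a live network request, so it is left commented out):
# data = fetch_repo_quality("Blade6570", "icface")
# print(data)
print(f"{API_BASE}/Blade6570/icface")
```

For the higher 1,000-requests/day tier, the free key would presumably be passed as a header or query parameter; the page above does not specify which, so consult the API's own documentation.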
Related models
HaiyuWu/Vec2Face
This is the official implementation of "Vec2Face: Scaling Face Dataset Generation with Loosely...
anisha2102/sketch2face
Conversion of sketches to photos using GANs.
ivanvovk/controllable-face-generation
Controllable Face Generation via pretrained Conditional Adversarial Latent Autoencoder (ALAE)
bryandlee/naver-webtoon-faces
Generative models on NAVER Webtoon faces
bryandlee/malnyun_faces
A calm generative-model training journal (original description in Korean: "침착한 생성모델 학습기")