foivospar/NED

PyTorch implementation of NED (CVPR 2022). It can be used to manipulate the facial emotions of actors in videos based on emotion labels or reference styles.

Quality score: 44 / 100 (Emerging)

This project helps video creators, such as filmmakers or animators, modify the emotional expressions on actors' faces in existing video footage. You input a video of a person speaking or expressing themselves, along with desired emotion labels like "happy" or "angry," or a reference video showing the desired style. The system then outputs a new video where the actor's face conveys the new emotions, all while preserving the original speech and mouth movements.

160 stars. No commits in the last 6 months.

Use this if you need to alter the emotional performance of an actor in a video without re-shooting or losing the original dialogue, for applications like movie post-production, video games, or creating realistic avatars.

Not ideal if you need to generate entirely new facial movements or gestures beyond emotion, or if you require real-time manipulation for live applications.

video-editing film-post-production character-animation digital-avatars affective-computing
Stale (6 months) · No package · No dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 18 / 25


Stars: 160
Forks: 26
Language: Python
License: MIT
Last pushed: Oct 06, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/diffusion/foivospar/NED"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
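The curl call above can also be reproduced from Python using only the standard library. This is a minimal sketch: the response schema is not documented here, so the snippet simply decodes and pretty-prints whatever JSON comes back rather than assuming specific field names.

```python
import json
import urllib.request

# Base endpoint taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/diffusion"


def quality_url(owner: str, repo: str) -> str:
    """Build the quality-score URL for a GitHub owner/repo pair."""
    return f"{API_BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON quality report (requires network access)."""
    with urllib.request.urlopen(quality_url(owner, repo), timeout=10) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Pretty-print the report for this repo; stays within the 100 requests/day
    # no-key limit mentioned above.
    print(json.dumps(fetch_quality("foivospar", "NED"), indent=2))
```

With an API key, you would typically pass it as a header or query parameter; the exact mechanism is not specified here, so the sketch omits it.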