Prem-ium/Metahuman-Emotion-Recognition

Emotionally responsive Virtual Metahuman CV with Real-Time User Facial Emotion Detection (Unreal Engine 5).

Score: 37 / 100 (Emerging)

This project helps create lifelike virtual characters that respond to human emotions in real-time. It takes a live video feed of a person's face and determines their emotional state (like happy, sad, or surprised). A 3D virtual human in Unreal Engine then mirrors these emotions, providing a highly interactive experience. This is ideal for virtual experience designers, character animators, and anyone creating immersive digital interactions.

No commits in the last 6 months.

Use this if you need virtual characters to realistically detect and mimic user emotions in real-time within an Unreal Engine environment.

Not ideal if you are looking for a standalone emotion analysis tool without integration into 3D virtual characters or Unreal Engine.

virtual-reality gaming character-animation human-computer-interaction digital-twins
Status: Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 8 / 25
Maturity 16 / 25
Community 13 / 25

How are scores calculated?
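The four category subscores listed above are each out of 25 and are consistent with a simple sum to the overall score. A minimal sketch of that arithmetic (dictionary keys are illustrative, not an official schema):

```python
# Category subscores from this listing (each out of 25).
subscores = {
    "maintenance": 0,
    "adoption": 8,
    "maturity": 16,
    "community": 13,
}

# The overall score (out of 100) matches the sum of the four categories.
overall = sum(subscores.values())
print(overall)  # → 37
```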

Stars: 49
Forks: 7
Language: Python
License: MIT
Last pushed: Jan 10, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/Prem-ium/Metahuman-Emotion-Recognition"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
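The `curl` call above can also be issued from Python with the standard library. A minimal sketch, assuming only the URL shown in this listing; the JSON response schema is not documented here, so `fetch_quality` simply decodes whatever the endpoint returns:

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-score endpoint URL for a repository."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON quality report (requires network access)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

# Build the URL for this project (matches the curl example above).
url = quality_url("computer-vision", "Prem-ium", "Metahuman-Emotion-Recognition")
print(url)
```

Anonymous access is rate-limited to 100 requests/day, so a free API key is worth obtaining for anything beyond occasional lookups.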