alihassanml/Yolo11-Face-Emotion-Detection

This project implements a face emotion detection system using YOLOv11, trained on a custom dataset to classify emotions into five distinct classes. The model utilizes the Ultralytics YOLO framework for real-time inference.

Score: 29 / 100 (Experimental)

This project helps you instantly identify emotions on faces captured by a camera. It takes a live video feed or images as input and overlays the detected emotions (such as happy, sad, or angry) on each face. This is ideal for researchers studying human behavior, educators, or anyone needing to analyze emotional responses in real time.

No commits in the last 6 months.

Use this if you need to automatically detect and classify one of five basic human emotions from faces in live video or image streams.

Not ideal if you require detection of a wider range of emotions beyond the five basic classes or need to analyze emotions from non-face data.
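To make the output format concrete, here is a minimal sketch of turning YOLO-style detections into labeled emotions. The five class names and the `label_detections` helper are assumptions for illustration; the repository defines its own class order in its training configuration.

```python
# Hypothetical sketch: mapping YOLO-style detections (class index,
# confidence, bounding box) to emotion labels. The class names below
# are assumed -- the repo's dataset config defines the real order.
EMOTIONS = ["angry", "happy", "neutral", "sad", "surprise"]  # assumed order

def label_detections(detections, class_names=EMOTIONS):
    """Convert (class_id, confidence, (x1, y1, x2, y2)) tuples
    into dicts with a readable emotion name attached."""
    labeled = []
    for class_id, conf, box in detections:
        labeled.append({
            "emotion": class_names[class_id],
            "confidence": round(conf, 3),
            "box": box,
        })
    return labeled

# Example: two detected faces from a single frame
faces = [(1, 0.91, (34, 40, 120, 150)), (3, 0.78, (200, 52, 290, 160))]
print(label_detections(faces))
```

In practice the detections would come from an Ultralytics YOLO `predict()` call on a frame; this helper only shows the post-processing step.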

emotion-recognition behavior-analysis human-computer-interaction psychology-research video-analytics
Stale (6 months) · No package · No dependents
Maintenance 0 / 25
Adoption 7 / 25
Maturity 16 / 25
Community 6 / 25


Stars: 29
Forks: 2
Language: Jupyter Notebook
License: MIT
Last pushed: Oct 26, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/alihassanml/Yolo11-Face-Emotion-Detection"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.