AsutoshPati/Ani-Emo-Eye
This project displays a robot eye animation on a 0.96" OLED screen that changes dynamically with the sentiment of the user's text input. Running on the CAP10 Pratham board, it lets robots visually express emotions in real time.
Ani-Emo-Eye gives robots a more engaging personality. You speak or type text into the system, the text's sentiment is analyzed, and the OLED screen shows a robot eye animation matching that emotion. This suits anyone building interactive robots, educational tools, or prototypes where emotional expression adds value.
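The text-to-animation pipeline described above can be sketched in a few lines. This is an illustrative sketch only: the project's actual sentiment model and animation names are not documented here, so the keyword rules and identifiers below are assumptions.

```python
# Hypothetical sketch of the sentiment -> eye-animation mapping.
# The real project runs on the CAP10 Pratham board; these word lists
# and animation names are illustrative assumptions, not its actual code.
POSITIVE = {"great", "love", "happy", "awesome", "good"}
NEGATIVE = {"sad", "hate", "angry", "terrible", "bad"}

def classify_sentiment(text: str) -> str:
    """Very rough keyword-count sentiment: happy, sad, or neutral."""
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "happy"
    if score < 0:
        return "sad"
    return "neutral"

# Map each sentiment label to an eye-animation identifier
# that an OLED driver routine would play.
EYE_ANIMATIONS = {"happy": "eye_smile", "sad": "eye_droop", "neutral": "eye_blink"}

def animation_for(text: str) -> str:
    return EYE_ANIMATIONS[classify_sentiment(text)]

# animation_for("I love this robot") -> "eye_smile"
```

A real implementation would swap the keyword rules for a trained sentiment model and drive the OLED frame buffer instead of returning a string, but the overall text → sentiment → animation flow is the same.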
No commits in the last 6 months.
Use this if you want to add real-time emotional expression to a robot or interactive device through a simple, animated eye display.
Not ideal if you need complex facial expressions beyond a changing eye, or if your robot already has a sophisticated visual display for emotions.
Stars: 31
Forks: 5
Language: Jupyter Notebook
License: MIT
Category: ML Frameworks
Last pushed: Sep 07, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/AsutoshPati/Ani-Emo-Eye"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
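For scripted access, the same endpoint can be called from Python with the standard library. A minimal sketch, assuming only the endpoint shown in the curl example above; the shape of the JSON response is not documented here, so the snippet just prints the raw body.

```python
import urllib.request

# Base endpoint taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-data URL for a given GitHub owner/repo."""
    return f"{BASE}/{owner}/{repo}"

# Fetch the record (100 requests/day are allowed without a key).
# Uncomment to run against the live service:
# with urllib.request.urlopen(quality_url("AsutoshPati", "Ani-Emo-Eye")) as resp:
#     print(resp.read().decode())
```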
Higher-rated alternatives
HumeAI/hume-api-examples: Example projects built with the Hume AI APIs
isseu/emotion-recognition-neural-networks: Emotion recognition using DNN with tensorflow
atulapra/Emotion-detection: Real-time Facial Emotion Detection using deep learning
amineHorseman/facial-expression-recognition-using-cnn: Deep facial expressions recognition using Opencv and Tensorflow. Recognizing facial expressions...
otaha178/Emotion-recognition: Real time emotion recognition