AsutoshPati/Ani-Emo-Eye

This project features a robot eye animation on a 0.96" OLED screen that dynamically changes based on the sentiment of the user's text input. Powered by the CAP10 Pratham board, it brings a touch of emotional intelligence to robotics, letting bots visually express emotions in real time.

Score: 36 / 100 (Emerging)

Ani-Emo-Eye helps give robots a more engaging personality by allowing them to visually express emotions. You speak or type text into a system, which then analyzes the sentiment of your words, and a small OLED screen displays a robot eye animation that changes to reflect that emotion. This is ideal for anyone developing interactive robots, educational tools, or prototypes where emotional expression adds value.
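The text-to-eye pipeline described above can be sketched in a few lines. This is a hypothetical illustration only: the actual project's sentiment model and OLED drawing code are not shown here, so a crude keyword-based classifier stands in for the real analysis, and animation names are placeholders.

```python
# Hypothetical sketch of the sentiment -> eye-animation pipeline.
# The real project's model and display API may differ.

POSITIVE = {"love", "great", "happy", "awesome", "good"}
NEGATIVE = {"hate", "sad", "angry", "terrible", "bad"}

def classify_sentiment(text: str) -> str:
    """Crude keyword polarity: returns 'happy', 'sad', or 'neutral'."""
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "happy"
    if score < 0:
        return "sad"
    return "neutral"

# Each sentiment selects an eye animation to render on the OLED.
EYE_ANIMATIONS = {
    "happy": "curved upward eye with sparkle frames",
    "sad": "drooping half-closed eye",
    "neutral": "round blinking eye",
}

def eye_for(text: str) -> str:
    return EYE_ANIMATIONS[classify_sentiment(text)]

print(eye_for("I love this robot"))
```

In the real system, the string lookup would instead trigger a frame sequence drawn to the 0.96" display.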

No commits in the last 6 months.

Use this if you want to add real-time emotional expression to a robot or interactive device through a simple, animated eye display.

Not ideal if you need complex facial expressions beyond a changing eye, or if your robot already has a sophisticated visual display for emotions.

Tags: robotics, human-robot-interaction, educational-robotics, prototype-development, embedded-systems
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 7 / 25
Maturity 16 / 25
Community 13 / 25


Stars: 31
Forks: 5
Language: Jupyter Notebook
License: MIT
Last pushed: Sep 07, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/AsutoshPati/Ani-Emo-Eye"

Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
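The same endpoint can be queried from Python with the standard library. The response schema is not documented here, so this sketch only builds the URL (matching the curl example) and shows, commented out, how one would fetch and print the raw JSON:

```python
import json
import urllib.request

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-API URL for a GitHub repo, per the curl example."""
    return f"https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/{owner}/{repo}"

url = quality_url("AsutoshPati", "Ani-Emo-Eye")
print(url)

# Uncomment to fetch (requires network; 100 requests/day without a key):
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
#     print(json.dumps(data, indent=2))
```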