studerus/pepper-android-realtime-chat

Open-source Android framework for low-latency, LLM-driven multimodal interaction on Pepper. Uses end-to-end speech-to-speech models and extensive Function Calling for agentic robot control (navigation, gaze, vision, touch). Also runs on regular Android devices.

Quality score: 30 / 100 (Emerging)

This project gives the Pepper robot a highly interactive, natural communication system: it can hold conversations, understand touch and vision, and move autonomously. Spoken requests, touch input, and visual information drive the robot's actions, answers, and interactive games. It is aimed at researchers, educators, and businesses who want to deploy an advanced AI assistant on a Pepper robot.

Use this if you need a Pepper robot to have lifelike, real-time voice conversations and intelligently respond to its environment through movement, vision, and touch.

Not ideal if you are looking for a general-purpose AI chatbot that doesn't involve physical robot interaction or require multimodal sensory input.

Tags: robotics, human-robot-interaction, AI assistant, customer engagement, education
No package published · No dependents

Maintenance: 10 / 25
Adoption: 5 / 25
Maturity: 15 / 25
Community: 0 / 25
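The overall score appears to be the sum of the four 25-point subscores above; this is an observation from the numbers shown, not a documented formula. A quick check:

```python
# Subscores as listed on this page; summing them reproduces the
# overall quality score of 30 / 100 (assumed relationship, not documented).
subscores = {"Maintenance": 10, "Adoption": 5, "Maturity": 15, "Community": 0}
overall = sum(subscores.values())
print(overall)  # 30
```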


Stars: 10
Forks: (not listed)
Language: Kotlin
License: (not listed)
Last pushed: Mar 10, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/studerus/pepper-android-realtime-chat"

Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
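A programmatic equivalent of the curl call above, sketched in Python with only the standard library. The JSON response shape is not documented on this page, so the sketch just builds the endpoint URL and fetches the raw response text:

```python
import urllib.request

# Base endpoint taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner: str, repo: str) -> str:
    # Build the per-repository endpoint: BASE/<owner>/<repo>.
    return f"{BASE}/{owner}/{repo}"

url = quality_url("studerus", "pepper-android-realtime-chat")
print(url)

# Uncomment to actually fetch (counts against the anonymous 100 requests/day quota):
# with urllib.request.urlopen(url) as resp:
#     print(resp.read().decode("utf-8"))
```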