zhoupingjay/LlamaPi
Raspberry Pi Voice Chatbot backed by local or cloud LLM, with Robot Arm gestures
This project helps you create a voice-controlled chatbot with a physical robot arm using a Raspberry Pi. You speak to it, and it responds both verbally and with gestures from the robot arm, powered by an AI language model. It's designed for enthusiasts or educators interested in combining AI, voice interaction, and robotics.
No commits in the last 6 months.
Use this if you want to build an interactive, voice-controlled robot arm using a Raspberry Pi, where the robot can understand natural language and respond with both speech and physical movements.
Not ideal if you need a high-performance, industrial-grade robot arm controller or a production-ready conversational AI system.
Stars: 22
Forks: 2
Language: Python
License: MIT
Category:
Last pushed: Oct 30, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/zhoupingjay/LlamaPi"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
bigsk1/voice-chat-ai
🎙️ Speak with AI - Run locally using Ollama, OpenAI, Anthropic or xAI - Speech uses SparkTTS,...
digiteinfotech/kairon
Agentic AI platform that harnesses Visual LLM Chaining to build proactive digital assistants
AmberSahdev/Open-Interface
Control Any Computer Using LLMs.
second-state/echokit_server
Open Source Voice Agent Platform
withcatai/catai
Run AI ✨ assistant locally! with simple API for Node.js 🚀