LegendLeoChen/LeoRobot
ROS2 Humble: Use voice commands to make the robot grasp the target object. In a ROS2 Humble simulation environment, voice commands are received and converted into instructions by a large language model, directing a wheeled robot to detect targets, build a map, navigate, and grasp objects.
This project helps operations engineers and robotics researchers prototype and demonstrate mobile robot task execution using natural language commands. You speak an instruction to the robot, and it interprets the command to navigate, identify, and grasp target objects in a simulated environment. This is useful for anyone looking to quickly test and visualize voice-controlled robotic workflows.
No commits in the last 6 months.
Use this if you need to rapidly prototype and visualize a mobile robot performing pick-and-place tasks based on spoken commands in a simulated environment.
Not ideal if you require robust, real-world deployment on a physical robot or advanced multi-robot coordination.
Stars: 34
Forks: 6
Language: Python
License: MIT
Category:
Last pushed: Jan 01, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/LegendLeoChen/LeoRobot"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
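The endpoint above follows a simple owner/repo path pattern. A minimal sketch that builds the request URL for any repository (the URL pattern is taken from the curl example; anything beyond that, such as response format, is an assumption):

```python
# Base path copied from the curl example above; the rest of the
# API's behavior (JSON shape, auth headers) is not documented here.
BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner: str, repo: str) -> str:
    """Return the quality-API URL for a given GitHub owner/repo."""
    return f"{BASE}/{owner}/{repo}"

print(quality_url("LegendLeoChen", "LeoRobot"))
```

With a free key (1,000 requests/day), you would presumably pass it alongside the request; without one, the open tier allows 100 requests/day.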
Higher-rated alternatives
langfengQ/verl-agent
verl-agent is an extension of veRL, designed for training LLM/VLM agents via RL. verl-agent is...
sotopia-lab/sotopia
Sotopia: an Open-ended Social Learning Environment (ICLR 2024 spotlight)
zhudotexe/redel
ReDel is a toolkit for researchers and developers to build, iterate on, and analyze recursive...
TIGER-AI-Lab/verl-tool
A version of verl to support diverse tool use
AMAP-ML/Tree-GRPO
[ICLR 2026] Tree Search for LLM Agent Reinforcement Learning