mbodiai/embodied-agents
Seamlessly integrate state-of-the-art transformer models into robotics stacks
This toolkit helps robotics engineers and researchers integrate advanced AI models, specifically large multimodal transformer models, into existing robot control systems. It takes raw sensor data (e.g., images, depth scans) and language commands as input, and outputs robot movements or textual responses, making it easier to build intelligent, responsive robots without deep AI development expertise. It is designed for those building or operating robots that need to understand and react to their environment and to human instructions.
Use this if you are a robotics engineer or researcher looking to add sophisticated AI capabilities like visual perception, natural language understanding, and complex motor control to your robots, leveraging state-of-the-art models with minimal integration effort.
Not ideal if you are working with purely simulated environments and do not plan to deploy to physical robot hardware, or if your robot's tasks do not require advanced perception or natural language interaction.
Stars: 281
Forks: 32
Language: Python
License: Apache-2.0
Category:
Last pushed: Dec 16, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/agents/mbodiai/embodied-agents"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
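If you prefer to query the endpoint from Python rather than curl, a minimal sketch using only the standard library is shown below. The URL pattern comes from the curl example above; the shape of the JSON response is not documented here, so the fetch helper returns the decoded payload as-is.

```python
# Minimal sketch of a Python client for the pt-edge quality API.
# The URL pattern is taken from the curl example above; the response
# schema is unspecified, so fetch_quality returns the raw decoded JSON.
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/agents"

def quality_url(owner: str, repo: str) -> str:
    """Build the API URL for a given owner/repo pair."""
    return f"{API_BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str, timeout: float = 10.0) -> dict:
    """Fetch and decode the JSON payload (keyless access: 100 requests/day)."""
    with urllib.request.urlopen(quality_url(owner, repo), timeout=timeout) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Print the URL only; uncomment the fetch to make a live request.
    print(quality_url("mbodiai", "embodied-agents"))
    # data = fetch_quality("mbodiai", "embodied-agents")
```

The free API key mentioned above raises the limit to 1,000 requests/day; how the key is passed (header or query parameter) is not stated here, so it is omitted from the sketch.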
Related agents
masamasa59/ai-agent-papers
A collection of AI Agents papers (Updated biweekly)
automatika-robotics/embodied-agents
EmbodiedAgents is a fully-loaded ROS2 based framework for creating interactive physical agents...
HKUSTDial/awesome-data-agents
Continuously updated paper list on advancements in Data Agents. Companion repo to our paper "A...