mbodiai/embodied-agents

Seamlessly integrate state-of-the-art transformer models into robotics stacks

Score: 48 / 100 (Emerging)

This toolkit helps robotics engineers and researchers integrate large multimodal models (such as vision-language transformers) into existing robot control systems. It takes raw sensor data (e.g., images, depth scans) and language commands as input, and outputs robot movements or textual responses, making it easier to build intelligent, responsive robotic systems without extensive AI development expertise. It is designed for teams developing or operating robots that need to understand and react to their environment and to human instructions.
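The observation-in, action-out loop described above can be sketched in plain Python. The class and field names below are illustrative stand-ins, not the library's actual API; a real policy would run a multimodal transformer where the stub returns a fixed response.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical types sketching the sense -> decide -> act loop described
# above; these names are illustrative, not this library's actual API.

@dataclass
class Observation:
    image: bytes          # raw camera frame (or depth scan)
    instruction: str      # natural-language command from a human

@dataclass
class Action:
    joint_deltas: List[float]  # motor commands for the robot's joints
    reply: str                 # optional textual response to the human

class EchoPolicy:
    """Stand-in for a multimodal model mapping observations to actions."""

    def act(self, obs: Observation) -> Action:
        # A real policy would run a transformer over (image, instruction);
        # this stub returns zero motion and echoes the instruction.
        return Action(joint_deltas=[0.0] * 6, reply=f"ok: {obs.instruction}")

policy = EchoPolicy()
action = policy.act(Observation(image=b"", instruction="pick up the cube"))
```

The control loop then reduces to repeatedly building an `Observation` from sensors and dispatching the resulting `Action` to the robot's actuators.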


Use this if you are a robotics engineer or researcher looking to add sophisticated AI capabilities like visual perception, natural language understanding, and complex motor control to your robots, leveraging state-of-the-art models with minimal integration effort.

Not ideal if you are working with purely simulated environments and do not plan to deploy to physical robot hardware, or if your robot's tasks do not require advanced perception or natural language interaction.

robotics-integration robot-control-systems autonomous-agents robot-perception human-robot-interaction
No package · No dependents
Maintenance: 6 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 16 / 25


Stars: 281
Forks: 32
Language: Python
License: Apache-2.0
Last pushed: Dec 16, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/agents/mbodiai/embodied-agents"

Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.