haoliuhl/instructrl

Instruction Following Agents with Multimodal Transformers

34 / 100 (Emerging)

This project helps robotics researchers and engineers train robots to follow complex instructions by learning from visual and language data. You input video demonstrations of robot actions paired with descriptive text instructions, and the system outputs a trained robot control policy that can execute new, similar commands. This is primarily for those developing intelligent robotic systems capable of understanding and responding to human language.

No commits in the last 6 months.

Use this if you need to develop robot agents that can interpret and execute natural language instructions based on visual observations.

Not ideal if your robotics tasks do not involve visual or linguistic instructions, or if you are not working with deep learning models for robot control.

robotics robot-learning embodied-AI instruction-following human-robot-interaction
Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 8 / 25
Maturity: 16 / 25
Community: 10 / 25


Stars: 53
Forks: 5
Language: Python
License: MIT
Last pushed: Nov 03, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/haoliuhl/instructrl"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
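The same request can be made from Python instead of curl. A minimal sketch, assuming only the endpoint shown above; the `build_quality_url` helper and the response being JSON are illustrative assumptions, not a documented client for this service:

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def build_quality_url(topic: str, owner: str, repo: str) -> str:
    """Build the quality-score endpoint URL for a repository.

    The topic/owner/repo path layout is inferred from the curl
    example above, not from official API documentation.
    """
    return f"{API_BASE}/{topic}/{owner}/{repo}"


def fetch_quality(topic: str, owner: str, repo: str) -> dict:
    """Fetch and decode the quality report (network required;
    assumes the endpoint returns JSON)."""
    with urllib.request.urlopen(build_quality_url(topic, owner, repo)) as resp:
        return json.load(resp)


# Reconstructs the exact URL from the curl example above.
url = build_quality_url("transformers", "haoliuhl", "instructrl")
print(url)
```

Within the free tier (100 requests/day without a key), `fetch_quality` can be called directly; for higher volume, a key would presumably be passed as a header or query parameter, but the exact mechanism is not stated here.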