aim-uofa/StaMo

Unsupervised Learning of Generalizable Robot Motion from Compact State Representation

Score: 16 / 100 (Experimental)

This project helps robotics researchers develop and train models that learn robot motion control from static images. You supply existing robot demonstration data (images converted to JSON), and the output is a trained model that can generate new, generalized robot motions. It is aimed at robotics researchers and academics working on robot learning and control.

No commits in the last 6 months.

Use this if you need to train a robot motion generation model using unsupervised learning from a compact state representation of robotic demonstrations.

Not ideal if you need a pre-trained model for immediate deployment or are not comfortable with machine learning model training workflows.

Tags: robotics research, robot motion generation, unsupervised learning, robot control, robot learning
No License · Stale (6 months) · No Package · No Dependents
Maintenance: 2 / 25
Adoption: 7 / 25
Maturity: 7 / 25
Community: 0 / 25


Stars: 35
Forks:
Language: Python
License: none
Last pushed: Oct 09, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/diffusion/aim-uofa/StaMo"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
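The curl command above can also be scripted. Below is a minimal Python sketch that builds the same endpoint URL for any owner/repo pair; the `quality_url` helper name is my own, and the response schema is not documented here, so the fetch step is only hinted at in a comment.

```python
from urllib.parse import quote

# Base path taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/diffusion"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-score endpoint URL for a GitHub repository."""
    # quote() guards against characters that need URL-escaping.
    return f"{API_BASE}/{quote(owner)}/{quote(repo)}"

# The StaMo repository shown on this page:
url = quality_url("aim-uofa", "StaMo")
print(url)
# To actually fetch the data (100 requests/day without a key):
#   import urllib.request
#   data = urllib.request.urlopen(url).read()
```

With a free API key (1,000 requests/day), the key would presumably be attached to the request, but the mechanism (header vs. query parameter) is not stated on this page, so it is left out of the sketch.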