kyegomez/HLT

Implementation of the transformer from the paper: "Real-World Humanoid Locomotion with Reinforcement Learning"

Quality score: 48 / 100 (Emerging)

This project helps robotics engineers and researchers working on advanced humanoid robots. It takes sensor observations (like video feeds) and high-level instructions as input. The output is a sequence of detailed actions for the robot to execute, enabling it to move realistically and adaptively in real-world environments.
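The observation-and-command-to-action mapping described above can be sketched as a single attention step. This is a minimal, hypothetical illustration: every name, shape, and weight here is an assumption for exposition, not the repository's actual API.

```python
# Hypothetical sketch of a vision-and-command-to-action policy step.
# Observation tokens plus one command token go through scaled dot-product
# attention; each observation step is then projected to an action vector.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def policy_step(obs_tokens, command, w_q, w_k, w_v, w_out):
    """Map encoded observations plus a command embedding to an action sequence.

    obs_tokens: (T, d) encoded sensor observations (e.g. video features)
    command:    (d,)   encoded high-level instruction
    returns:    (T, a) one action vector per observation step
    """
    tokens = np.vstack([command[None, :], obs_tokens])   # prepend command token
    q, k, v = tokens @ w_q, tokens @ w_k, tokens @ w_v
    attn = softmax(q @ k.T / np.sqrt(k.shape[-1]))       # scaled dot-product attention
    mixed = attn @ v
    return mixed[1:] @ w_out                             # drop command slot, project to actions

rng = np.random.default_rng(0)
d, a, T = 8, 4, 5                                        # toy embedding/action/sequence sizes
actions = policy_step(
    rng.normal(size=(T, d)), rng.normal(size=d),
    *(rng.normal(size=(d, d)) for _ in range(3)), rng.normal(size=(d, a)),
)
print(actions.shape)  # (5, 4)
```

The command token is prepended so the attention step can condition every observation on the high-level instruction; a real model would stack many such layers with learned weights.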

Use this if you are developing reinforcement learning systems for humanoid robots and need a model that can process visual inputs and abstract commands to generate complex movement patterns.

Not ideal if you are working with non-humanoid robots or simpler control tasks that don't require advanced vision-to-action translation.

humanoid-robotics robot-locomotion reinforcement-learning robot-control autonomous-systems
No package published · No dependents
Maintenance 10 / 25
Adoption 8 / 25
Maturity 16 / 25
Community 14 / 25


Stars: 62
Forks: 9
Language: Python
License: MIT
Last pushed: Jan 31, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/kyegomez/HLT"

Open to everyone: 100 requests/day with no key required. Get a free key for 1,000 requests/day.
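The same endpoint can be called from Python. The URL path is taken from the curl example above; the JSON fields parsed here (repo name, overall score, tier) are an assumed schema shaped after the figures on this page, not documented output.

```python
# Build the quality-score URL and parse an assumed response payload.
# The endpoint path matches the curl example; the response schema is hypothetical.
import json
from urllib.parse import quote

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-score URL for a GitHub repository."""
    return f"{API_BASE}/{quote(owner)}/{quote(repo)}"

def summarize(payload: str) -> str:
    """Render an assumed JSON payload as a one-line summary."""
    data = json.loads(payload)
    return f"{data['repo']}: {data['score']}/100 ({data['tier']})"

url = quality_url("kyegomez", "HLT")
print(url)

# Example payload shaped after the numbers shown on this page (hypothetical):
sample = '{"repo": "kyegomez/HLT", "score": 48, "tier": "Emerging"}'
print(summarize(sample))  # kyegomez/HLT: 48/100 (Emerging)
```

In a real script you would fetch `url` with `urllib.request.urlopen` or `requests.get` and pass the response body to `summarize`.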