UT-Austin-RPL/PRELUDE

Official codebase for PRELUDE (Perceptive Locomotion Under Dynamic Environments)

Quality score: 35 / 100 (Emerging)

This project helps roboticists design and implement robust navigation and walking behaviors for quadruped robots operating in cluttered, dynamic environments. You provide either human-controlled demonstrations or existing datasets of a robot's visual input and movement commands. The system then outputs trained navigation and gait controllers that enable the robot to perceive and react to its surroundings, traverse complex terrains, and avoid obstacles autonomously. This is intended for robotics researchers and engineers working on autonomous quadruped locomotion.

No commits in the last 6 months.

Use this if you need to develop highly agile and perceptive quadruped robots capable of navigating unpredictable real-world environments with moving obstacles.

Not ideal if you are working with wheeled robots, static environments, or if your primary focus is on fine-tuned motor control rather than high-level perception and navigation.

Tags: robotics, quadruped-locomotion, autonomous-navigation, robot-perception, robot-control

Status: Stale (6 months), No Package, No Dependents

Maintenance: 2 / 25
Adoption: 9 / 25
Maturity: 16 / 25
Community: 8 / 25
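The four category scores appear to add up to the overall score shown above. A minimal sketch of that assumed aggregation (the simple-sum rule is an inference from the numbers on this page, not a documented formula):

```python
# Assumed aggregation: overall quality score = sum of the four category
# scores, each out of 25, giving a total out of 100.
categories = {
    "Maintenance": 2,
    "Adoption": 9,
    "Maturity": 16,
    "Community": 8,
}

overall = sum(categories.values())
print(overall)  # 35, matching the displayed 35 / 100
```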


Stars: 80
Forks: 4
Language: Python
License: MIT
Last pushed: Aug 07, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/UT-Austin-RPL/PRELUDE"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
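The same endpoint can be called from Python using only the standard library. This is a sketch that fetches and decodes the JSON body; the response schema is not documented on this page, so inspect the payload before relying on any field names:

```python
import json
import urllib.request

# Endpoint from the curl example above (no key needed, 100 requests/day).
URL = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/UT-Austin-RPL/PRELUDE"

def fetch_quality(url: str = URL) -> dict:
    """Fetch the quality report and decode its JSON body."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Usage (hits the live endpoint):
#   report = fetch_quality()
#   print(report)  # field names in the payload are not documented here
```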