bagh2178/UniGoal
[CVPR 2025] UniGoal: Towards Universal Zero-shot Goal-oriented Navigation
This project helps roboticists and autonomous-system developers build robots that navigate to specified goals in environments they have never seen, without training on those environments. You provide the goal as an image from a robot's camera or as descriptive text, and the system generates the actions the robot needs to reach it. It is aimed at researchers and engineers developing autonomous robots for dynamic, unfamiliar settings.
311 stars. No commits in the last 6 months.
Use this if you need a robotic agent to navigate to a goal specified by an image or text in an unfamiliar indoor environment without extensive pre-training.
Not ideal if your robot operates exclusively in pre-mapped, static environments or if your navigation tasks require only simple waypoint following.
Stars
311
Forks
12
Language
Python
License
MIT
Category
Computer Vision
Last pushed
Sep 16, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/bagh2178/UniGoal"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
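The same endpoint can also be called from Python. Below is a minimal sketch using only the standard library; note that the JSON response schema is not documented on this page, so the returned fields are an assumption and the result is left as a plain dict:

```python
# Sketch: query the pt-edge quality API for a repo.
# Only the endpoint URL comes from this page; the response fields are undocumented.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the per-repo quality endpoint URL."""
    return f"{BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """GET the endpoint and decode the JSON body (schema not documented here)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)


# Example call (consumes one of the free 100 requests/day):
# data = fetch_quality("computer-vision", "bagh2178", "UniGoal")
```

Each unkeyed request counts against the shared 100/day limit, so cache responses locally if you poll repeatedly.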
Higher-rated alternatives
col14m/cadrille
[ICLR2026] cadrille: Multi-modal CAD Reconstruction with Online Reinforcement Learning
filaPro/cad-recode
[ICCV2025] CAD-Recode: Reverse Engineering CAD Code from Point Clouds
pengsongyou/openscene
[CVPR'23] OpenScene: 3D Scene Understanding with Open Vocabularies
worldbench/3EED
[NeurIPS 2025 DB Track] 3EED: Ground Everything Everywhere in 3D
cambrian-mllm/cambrian-s
Cambrian-S: Towards Spatial Supersensing in Video