coperception/star
[CoRL 2022] Multi-Robot Scene Completion
This project helps self-driving cars and other robotic systems build a more complete picture of their surroundings by sharing sensor data among multiple robots. It takes the partial views captured by individual robots and fuses them into a single, detailed reconstruction of the shared environment. It is aimed at robotics engineers and researchers developing robust autonomous perception systems.
No commits in the last 6 months.
Use this if you need multiple robots to collaboratively build a comprehensive understanding of their shared environment for various autonomous tasks.
Not ideal if your robots operate independently without needing to share perception data, or if you require a task-specific perception solution.
Stars: 38
Forks: 3
Language: Python
License: Apache-2.0
Category: computer-vision
Last pushed: Nov 29, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/coperception/star"
Open to everyone: 100 requests/day with no key needed, or 1,000/day with a free key.
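For scripted access, the same endpoint can be called from Python. This is a minimal sketch: the URL shape (`/api/v1/quality/<category>/<owner>/<repo>`) is taken from the curl example above, but the JSON response schema is not documented here, so the result is treated as an opaque dict.

```python
# Minimal sketch of calling the pt-edge quality API from Python.
# The endpoint shape comes from the curl example above; the response
# schema is an assumption (treated here as an opaque JSON object).
import json
from urllib.parse import quote
from urllib.request import urlopen

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a repo in a given category."""
    return f"{API_BASE}/{quote(category)}/{quote(owner)}/{quote(repo)}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch the quality record (subject to the 100 requests/day limit)."""
    with urlopen(quality_url(category, owner, repo), timeout=10) as resp:
        return json.load(resp)

# quality_url("computer-vision", "coperception", "star") reproduces the
# URL shown in the curl example above.
```

Note that `fetch_quality` performs a live network request, so it counts against the daily rate limit on every call.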
Higher-rated alternatives
vita-epfl/monoloco
A 3D vision library from 2D keypoints: monocular and stereo 3D detection for humans, social...
fangchangma/self-supervised-depth-completion
ICRA 2019 "Self-supervised Sparse-to-Dense: Self-supervised Depth Completion from LiDAR and...
nburrus/stereodemo
Small Python utility to compare and visualize the output of various stereo depth estimation algorithms
JiawangBian/sc_depth_pl
SC-Depth (V1, V2, and V3) for Unsupervised Monocular Depth Estimation ...
wvangansbeke/Sparse-Depth-Completion
Predict dense depth maps from sparse and noisy LiDAR frames guided by RGB images. (Ranked 1st...