andreaconti/sparsity-agnostic-depth-completion
Depth Completion technique agnostic to input depth pattern sparsity, WACV23
This tool helps autonomous-vehicle developers, robotics engineers, and 3D-reconstruction specialists fill in incomplete depth information from sensors such as LiDAR or depth cameras. Given a sparse depth map (a 2D image where most pixels have no depth value) and the corresponding color image, it outputs a complete, dense depth map. It is especially useful when sensor readings are very sparse or vary in density.
No commits in the last 6 months.
Use this if you need to generate accurate, full depth maps from highly sparse or variably dense sensor inputs for applications like autonomous navigation or 3D scene understanding.
Not ideal if your depth inputs are consistently dense and complete, or if you need to process only standard, evenly distributed depth data.
Stars
33
Forks
1
Language
Python
License
—
Category
—
Last pushed
Nov 23, 2023
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/diffusion/andreaconti/sparsity-agnostic-depth-completion"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
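For programmatic access beyond a one-off curl, the same endpoint can be called from Python. This is a minimal sketch, assuming the endpoint returns JSON; the `X-Api-Key` header name and the `fetch_quality` helper are assumptions for illustration, not documented parts of the API.

```python
import json
import urllib.request

# Base endpoint shown in the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/diffusion"

def quality_url(owner, repo):
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner, repo, api_key=None):
    """Fetch quality data for a repository.

    The 'X-Api-Key' header name is an assumption; check the API docs
    for the actual way to pass the optional free key (1,000 req/day).
    """
    req = urllib.request.Request(quality_url(owner, repo))
    if api_key:
        req.add_header("X-Api-Key", api_key)  # assumed header name
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

print(quality_url("andreaconti", "sparsity-agnostic-depth-completion"))
```

Without a key the anonymous limit is 100 requests/day, so cache responses rather than re-fetching per page load.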
Higher-rated alternatives
fangchangma/sparse-to-dense
ICRA 2018 "Sparse-to-Dense: Depth Prediction from Sparse Depth Samples and a Single Image"...
Aradhye2002/EcoDepth
[CVPR'2024] Official implementation of the paper "ECoDepth: Effective Conditioning of Diffusion...
fangchangma/sparse-to-dense.pytorch
ICRA 2018 "Sparse-to-Dense: Depth Prediction from Sparse Depth Samples and a Single Image"...
ShuweiShao/MonoDiffusion
[TCSVT2024] MonoDiffusion: Self-Supervised Monocular Depth Estimation Using Diffusion Model
albert100121/AiFDepthNet
Official Pytorch implementation of ICCV 2021 paper "Bridging Unsupervised and Supervised...