alexklwong/calibrated-backprojection-network
PyTorch Implementation of Unsupervised Depth Completion with Calibrated Backprojection Layers (ORAL, ICCV 2021)
This project converts sparse 3D point cloud data, such as measurements from LiDAR or Structure-from-Motion, into a complete, dense 3D representation. Given an image, a sparse point cloud (or sparse depth map), and the camera calibration parameters, it outputs a refined, dense point cloud. It is useful for robotics engineers, autonomous-vehicle developers, and anyone working on 3D scene understanding where detailed environmental mapping is critical.
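The role of the calibration parameters can be illustrated with a minimal NumPy sketch (not the repository's actual code): the camera intrinsics K are what let a sparse depth map be lifted ("backprojected") into 3D points in the camera frame.

```python
import numpy as np

def backproject(depth, K):
    """Lift a depth map of shape (H, W) to 3D points using intrinsics K (3x3).

    Pixels with depth 0 are treated as missing and skipped.
    Returns an (N, 3) array of camera-frame points.
    """
    v, u = np.nonzero(depth)                    # rows, cols of valid pixels
    d = depth[v, u]
    pixels = np.stack([u, v, np.ones_like(u)])  # homogeneous pixel coords, (3, N)
    rays = np.linalg.inv(K) @ pixels            # unit-depth viewing rays
    return (rays * d).T                         # scale rays by depth -> (N, 3)

# Toy example: a 4x4 sparse depth map with two valid measurements
K = np.array([[100.0, 0.0, 2.0],
              [0.0, 100.0, 2.0],
              [0.0, 0.0, 1.0]])
depth = np.zeros((4, 4))
depth[1, 2] = 5.0
depth[3, 0] = 2.0
points = backproject(depth, K)  # two 3D points in the camera frame
```

The calibrated backprojection layers in the paper build on this geometry inside the network; the sketch only shows the underlying pinhole-camera operation.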
129 stars. No commits in the last 6 months.
Use this if you need to reliably reconstruct full 3D scenes from limited, scattered depth measurements and corresponding images, even when using different sensor platforms.
Not ideal if your application requires depth estimation without any initial sparse point cloud data or if you primarily work with single 2D images.
Stars: 129
Forks: 24
Language: Python
License: —
Category:
Last pushed: Oct 23, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/alexklwong/calibrated-backprojection-network"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
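The endpoint path appears to follow a category/owner/repo pattern; a small sketch, assuming that scheme generalizes to other repositories (the base URL is taken from the example above, everything else is illustrative):

```python
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a given repository (assumed scheme)."""
    return f"{BASE}/{category}/{owner}/{repo}"

url = quality_url("computer-vision", "alexklwong", "calibrated-backprojection-network")
```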
Higher-rated alternatives
- vita-epfl/monoloco
  A 3D vision library from 2D keypoints: monocular and stereo 3D detection for humans, social...
- fangchangma/self-supervised-depth-completion
  ICRA 2019 "Self-supervised Sparse-to-Dense: Self-supervised Depth Completion from LiDAR and...
- nburrus/stereodemo
  Small Python utility to compare and visualize the output of various stereo depth estimation algorithms
- JiawangBian/sc_depth_pl
  SC-Depth (V1, V2, and V3) for Unsupervised Monocular Depth Estimation ...
- wvangansbeke/Sparse-Depth-Completion
  Predict dense depth maps from sparse and noisy LiDAR frames guided by RGB images. (Ranked 1st...