nburrus/stereodemo
Small Python utility to compare and visualize the output of various stereo depth estimation algorithms
This tool helps researchers and engineers compare stereo depth estimation methods side by side. You feed it a pair of left and right images (or a live camera feed), and it produces 3D point clouds and visualizations so you can see how well each method estimates depth. It's aimed at practitioners working in 3D sensing, robotics, or autonomous systems.
173 stars. No commits in the last 6 months. Available on PyPI.
Use this if you need to quickly compare the performance of various stereo depth estimation algorithms on your own images or live camera data to choose the best one for your application.
Not ideal if you want to develop new stereo depth estimation algorithms from scratch, or if you need monocular (single-image) depth estimation only.
Stars: 173
Forks: 14
Language: Python
License: MIT
Category: computer-vision
Last pushed: Apr 23, 2025
Commits (30d): 0
Dependencies: 7
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/nburrus/stereodemo"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
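The endpoint follows a simple category/owner/repo pattern. A minimal Python sketch of fetching it with the standard library; the helper name and the commented-out JSON handling are illustrative assumptions, and only the URL pattern comes from the curl example above:

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the API endpoint for a repository's quality data."""
    return f"{BASE}/{category}/{owner}/{repo}"

url = quality_url("computer-vision", "nburrus", "stereodemo")
print(url)

# Uncomment to perform the request (100 requests/day without a key):
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
```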
Higher-rated alternatives
vita-epfl/monoloco
A 3D vision library from 2D keypoints: monocular and stereo 3D detection for humans, social...
fangchangma/self-supervised-depth-completion
ICRA 2019 "Self-supervised Sparse-to-Dense: Self-supervised Depth Completion from LiDAR and...
JiawangBian/sc_depth_pl
SC-Depth (V1, V2, and V3) for Unsupervised Monocular Depth Estimation ...
wvangansbeke/Sparse-Depth-Completion
Predict dense depth maps from sparse and noisy LiDAR frames guided by RGB images. (Ranked 1st...
antocad/FocusOnDepth
A Monocular depth-estimation for in-the-wild AutoFocus application.