nburrus/stereodemo

Small Python utility to compare and visualize the output of various stereo depth estimation algorithms

Quality score: 49 / 100 (Emerging)

This tool helps researchers and engineers evaluate different computer vision models for depth perception. You feed it a pair of left and right images (or use a live camera feed), and it generates 3D point clouds and visualizations, allowing you to see how well each algorithm estimates depth. It's designed for practitioners working with 3D sensing, robotics, or autonomous systems.

173 stars. No commits in the last 6 months. Available on PyPI.

Use this if you need to quickly compare the performance of various stereo depth estimation algorithms on your own images or live camera data to choose the best one for your application.

Not ideal if you want to develop new stereo depth estimation algorithms, or if you need monocular (single-image) depth estimation only.

Tags: 3d-sensing, robotics, computer-vision, depth-estimation, autonomous-systems
Status: Stale (6 months)
Maintenance: 2 / 25
Adoption: 10 / 25
Maturity: 25 / 25
Community: 12 / 25


Stars: 173
Forks: 14
Language: Python
License: MIT
Last pushed: Apr 23, 2025
Commits (30d): 0
Dependencies: 7

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/nburrus/stereodemo"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
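The same endpoint can be called from Python using only the standard library. The URL pattern is taken from the `curl` example above; the shape of the JSON response is an assumption, so the sketch below only builds the URL and decodes whatever JSON comes back.

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the quality record (100 requests/day without a key)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

# Network call left commented out so the snippet runs offline:
# data = fetch_quality("computer-vision", "nburrus", "stereodemo")
```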