IRCVLab/Depth-Anything-for-Jetson-Orin
Real-time Depth Estimation for Jetson Orin
This project provides real-time depth perception for engineers and researchers working in robotics, autonomous vehicles, or augmented reality. It takes live video from a CSI camera connected to an NVIDIA Jetson Orin device and produces a dense depth map showing how far away each object in the scene is, giving applications immediate spatial awareness of their environment.
Use this if you need to equip an NVIDIA Jetson Orin device with the ability to understand 3D space from a live camera feed, for applications like navigation or object interaction.
Not ideal if you are not using an NVIDIA Jetson Orin device or if your application requires a different type of depth sensing, such as from a LiDAR sensor.
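On Jetson devices, CSI cameras are typically read through a GStreamer pipeline built around `nvarguscamerasrc` rather than a plain V4L2 device. A minimal sketch of constructing such a pipeline string, assuming an OpenCV build with GStreamer support; the resolution and framerate values are illustrative defaults, not taken from this repository:

```python
def gstreamer_pipeline(sensor_id=0, width=1280, height=720, fps=30):
    """Build a GStreamer string for a Jetson CSI camera (nvarguscamerasrc).

    The element names are the standard Jetson multimedia ones; the exact
    caps your camera module supports may differ.
    """
    return (
        f"nvarguscamerasrc sensor-id={sensor_id} ! "
        f"video/x-raw(memory:NVMM), width={width}, height={height}, "
        f"framerate={fps}/1 ! "
        "nvvidconv ! video/x-raw, format=BGRx ! "
        "videoconvert ! video/x-raw, format=BGR ! appsink drop=1"
    )

# Typical use with OpenCV (requires a GStreamer-enabled cv2 build):
#   import cv2
#   cap = cv2.VideoCapture(gstreamer_pipeline(), cv2.CAP_GSTREAMER)
#   ok, frame = cap.read()  # frame is then fed to the depth model
```

Each captured BGR frame would then be preprocessed and passed to the depth model for inference.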
Stars
65
Forks
13
Language
Python
License
—
Category
—
Last pushed
Nov 10, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/IRCVLab/Depth-Anything-for-Jetson-Orin"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
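A Python equivalent of the curl call above, using only the standard library. The URL path mirrors the example request; the `Authorization` header name used for the optional key is an assumption, so check the API documentation before relying on it:

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category, owner, repo):
    """Build the quality-endpoint URL for a given repository."""
    return f"{API_BASE}/{category}/{owner}/{repo}"

def fetch_quality(category, owner, repo, api_key=None):
    """Fetch repository quality data; pass api_key for the 1,000/day limit.

    The bearer-token header below is assumed, not confirmed by the listing.
    """
    req = urllib.request.Request(quality_url(category, owner, repo))
    if api_key:
        req.add_header("Authorization", f"Bearer {api_key}")  # assumed header
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)

# Example (performs a live request):
#   data = fetch_quality("computer-vision", "IRCVLab",
#                        "Depth-Anything-for-Jetson-Orin")
```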
Higher-rated alternatives
vita-epfl/monoloco
A 3D vision library from 2D keypoints: monocular and stereo 3D detection for humans, social...
fangchangma/self-supervised-depth-completion
ICRA 2019 "Self-supervised Sparse-to-Dense: Self-supervised Depth Completion from LiDAR and...
nburrus/stereodemo
Small Python utility to compare and visualize the output of various stereo depth estimation algorithms
JiawangBian/sc_depth_pl
SC-Depth (V1, V2, and V3) for Unsupervised Monocular Depth Estimation ...
wvangansbeke/Sparse-Depth-Completion
Predict dense depth maps from sparse and noisy LiDAR frames guided by RGB images. (Ranked 1st...