zjkhurry/stereopsis-anything
Stereo Anything converts 2D on-screen content into stereoscopic images (spatial video) in real time, theoretically compatible with various AR/VR glasses such as the Rayneo Air 1s/2s, X1, X2, Nreal Air, etc.
This project instantly turns any 2D content on your computer screen into a 3D (stereoscopic) video: it captures whatever is displayed, such as a regular video or an application window, and outputs a spatial-video format that various AR/VR glasses can display. It is ideal for anyone who wants to experience their existing digital content in an immersive, three-dimensional way.
No commits in the last 6 months.
Use this if you want to watch standard 2D videos, browse websites, or use applications in 3D through your AR/VR glasses, particularly if you have a macOS device.
Not ideal if you primarily use older hardware, require extremely low latency for fast-paced gaming, or need to generate stereoscopic content for professional 3D production workflows.
Stars: 51
Forks: 4
Language: Python
License: Apache-2.0
Category:
Last pushed: Oct 19, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/zjkhurry/stereopsis-anything"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
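The same endpoint can also be queried from Python instead of curl; here is a minimal sketch using only the standard library. Note that the response schema is not documented on this page, so the fields in the sample dictionary are illustrative assumptions based on the stats shown above.

```python
import json
from urllib.request import urlopen

API_URL = ("https://pt-edge.onrender.com/api/v1/quality/"
           "computer-vision/zjkhurry/stereopsis-anything")

def fetch_quality(url: str = API_URL) -> dict:
    """Fetch the repo-quality data as JSON (no key needed, 100 requests/day)."""
    with urlopen(url, timeout=10) as resp:
        return json.load(resp)

# Hypothetical response fields, mirroring the stats listed on this page;
# the real API may name or nest them differently.
data = {"stars": 51, "forks": 4, "language": "Python", "license": "Apache-2.0"}
print(f"{data['language']} repo: {data['stars']} stars, {data['forks']} forks")
```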
Higher-rated alternatives
vita-epfl/monoloco
A 3D vision library from 2D keypoints: monocular and stereo 3D detection for humans, social...
fangchangma/self-supervised-depth-completion
ICRA 2019 "Self-supervised Sparse-to-Dense: Self-supervised Depth Completion from LiDAR and...
nburrus/stereodemo
Small Python utility to compare and visualize the output of various stereo depth estimation algorithms
JiawangBian/sc_depth_pl
SC-Depth (V1, V2, and V3) for Unsupervised Monocular Depth Estimation ...
wvangansbeke/Sparse-Depth-Completion
Predict dense depth maps from sparse and noisy LiDAR frames guided by RGB images. (Ranked 1st...