tub-rip/DERD-Net

DERD-Net: Learning Depth from Event-based Ray Densities (NeurIPS 2025 Spotlight)

Score: 27 / 100 (Experimental)

This project helps engineers and researchers working with event cameras accurately estimate the distance of objects in dynamic environments. It converts event-camera data, often captured from drones or autonomous vehicles, into dense per-pixel depth maps. This is useful for anyone building systems that need precise spatial awareness, such as drone navigation or robot localization.

Use this if you need highly accurate, real-time depth measurements from event-based cameras, especially in scenarios where traditional cameras struggle due to high speed or low light.

Not ideal if your primary data source is standard frame-based cameras, or if you need depth estimation without specialized event-camera input.

robotics autonomous-vehicles drone-navigation computer-vision depth-estimation
No package, no dependents
Maintenance: 6 / 25
Adoption: 6 / 25
Maturity: 15 / 25
Community: 0 / 25


Stars: 16
Forks:
Language: Jupyter Notebook
License: MIT
Last pushed: Nov 22, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/tub-rip/DERD-Net"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
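The same endpoint can be called from Python using only the standard library. A minimal sketch, assuming the path pattern is `/api/v1/quality/<category>/<owner>/<repo>` (inferred from the single URL above); the JSON response structure is not documented here, so the helper simply decodes whatever is returned:

```python
import json
import urllib.request

# Endpoint taken from the page; the <category>/<owner>/<repo> path pattern
# is an assumption generalized from this one URL.
BASE = "https://pt-edge.onrender.com/api/v1/quality"
url = f"{BASE}/computer-vision/tub-rip/DERD-Net"

def fetch_quality(url: str) -> dict:
    """Fetch and decode the quality-score JSON (free tier: 100 requests/day, no key)."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# Example usage (performs a live network request):
# print(json.dumps(fetch_quality(url), indent=2))
```

With a free API key (1,000 requests/day), you would presumably attach it to the request; the page does not specify the header or query-parameter name, so that detail is omitted here.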