UkcheolShin/ThermalMonoDepth

Official implementation of the paper "Maximizing Self-supervision from Thermal Image for Effective Self-supervised Learning of Depth and Ego-motion"

Quality score: 32 / 100 (Emerging)

This project helps roboticists and autonomous vehicle developers accurately understand their environment using only thermal camera footage, especially in challenging conditions like darkness or bad weather. It takes a sequence of thermal images and outputs detailed depth maps of the scene and the precise movement (ego-motion) of the camera. This is for engineers and researchers building autonomous systems that need robust perception without relying on visible light cameras or other sensors like LiDAR.

No commits in the last 6 months.

Use this if you need to determine object distances and camera movement solely from thermal video in low-light, zero-light, or adverse weather conditions.

Not ideal if your application primarily uses standard RGB cameras or requires real-time processing on embedded systems without a dedicated GPU.

autonomous-navigation robotics-perception thermal-imaging depth-estimation pose-estimation
No License · Stale (6 months) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 7 / 25
Maturity: 8 / 25
Community: 17 / 25


Stars: 40
Forks: 8
Language: Python
License: None
Last pushed: Jul 25, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/UkcheolShin/ThermalMonoDepth"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
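The same endpoint can be called from Python with only the standard library. This is a minimal sketch: the URL pattern is taken from the curl command above, but the shape of the JSON payload is not documented here, so the code only fetches and decodes it without assuming any field names.

```python
import json
from urllib.parse import quote
from urllib.request import urlopen

# Base path taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a repository."""
    return f"{BASE}/{quote(category)}/{quote(owner)}/{quote(repo)}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (no API key needed, 100 requests/day)."""
    with urlopen(quality_url(category, owner, repo), timeout=10) as resp:
        return json.load(resp)


if __name__ == "__main__":
    url = quality_url("computer-vision", "UkcheolShin", "ThermalMonoDepth")
    print(url)
```

With a free API key the documented limit rises to 1,000 requests/day; how the key is passed (header vs. query parameter) is not specified here, so it is left out of the sketch.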