arclab-hku/DEIO
(ICCV2025) Learning-based Event-Inertial Odometry
This project helps robots and autonomous systems accurately determine their position and orientation, even in challenging conditions such as rapid motion or extreme lighting. It fuses data from event cameras (specialized sensors that detect per-pixel brightness changes) with inertial measurement unit (IMU) readings, and outputs precise 3D position and orientation estimates. Roboticists, drone developers, and autonomous vehicle engineers would use this for robust navigation.
Use this if you need highly accurate and robust motion tracking for autonomous systems using event cameras and IMUs, especially in demanding environments.
Not ideal if your application primarily relies on standard frame-based cameras or if you don't require the specialized capabilities of event cameras.
Stars
100
Forks
11
Language
Jupyter Notebook
License
MIT
Category
Computer Vision
Last pushed
Oct 29, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/arclab-hku/DEIO"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
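For programmatic use, the JSON returned by the endpoint above can be consumed directly. A minimal sketch in Python, assuming the response contains fields like `stars`, `forks`, and `language` matching the listing above (the exact schema is an assumption, not documented here):

```python
import json

# Hypothetical response body from the quality API endpoint shown above.
# Field names mirror the listing on this page but are illustrative only.
sample_response = """
{
  "repo": "arclab-hku/DEIO",
  "stars": 100,
  "forks": 11,
  "language": "Jupyter Notebook",
  "license": "MIT"
}
"""

data = json.loads(sample_response)
summary = f"{data['repo']}: {data['stars']} stars, {data['forks']} forks ({data['language']})"
print(summary)
```

In a real client you would replace `sample_response` with the body of an HTTP GET to the endpoint (e.g. via `urllib.request` or `requests`), keeping the free tier's 100-requests/day limit in mind.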
Higher-rated alternatives
changh95/visual-slam-roadmap
Roadmap to become a Visual-SLAM developer in 2026
coperception/coperception
An SDK for multi-agent collaborative perception.
w111liang222/lidar-slam-detection
LSD (LiDAR SLAM & Detection) is an open source perception architecture for autonomous vehicles/robots
ika-rwth-aachen/Cam2BEV
TensorFlow Implementation for Computing a Semantically Segmented Bird's Eye View (BEV) Image...
lvchuandong/Awesome-Multi-Camera-3D-Occupancy-Prediction
Awesome papers and code about Multi-Camera 3D Occupancy Prediction, such as TPVFormer,...