arclab-hku/DEIO

(ICCV2025) Learning-based Event-Inertial Odometry

Quality score: 44 / 100 (Emerging)

This project helps robots and autonomous systems accurately determine their location and orientation, even in challenging conditions like rapid movement or extreme lighting. It takes in data from specialized "event cameras" (which detect changes in brightness) and inertial measurement units (IMUs) and outputs precise 3D position and orientation estimates. Roboticists, drone developers, and autonomous vehicle engineers would use this for robust navigation.


Use this if you need highly accurate and robust motion tracking for autonomous systems using event cameras and IMUs, especially in demanding environments.

Not ideal if your application primarily relies on standard frame-based cameras or if you don't require the specialized capabilities of event cameras.

robot-navigation autonomous-vehicles drone-guidance SLAM sensor-fusion
No package · No dependents
Maintenance: 6 / 25
Adoption: 9 / 25
Maturity: 16 / 25
Community: 13 / 25


Stars: 100
Forks: 11
Language: Jupyter Notebook
License: MIT
Last pushed: Oct 29, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/arclab-hku/DEIO"

Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
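The endpoint path appears to follow a `/quality/<category>/<owner>/<repo>` pattern. As a minimal sketch, assuming that pattern generalizes from the single documented example above, the URL for any repository could be built like this (the `quality_url` helper is hypothetical, not part of the API):

```python
# Minimal sketch: build the quality-API URL for a repository.
# Assumption: the path pattern /quality/<category>/<owner>/<repo>
# generalizes from the one documented curl example.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    # Path segments mirror the documented example:
    # /api/v1/quality/computer-vision/arclab-hku/DEIO
    return f"{API_BASE}/{category}/{owner}/{repo}"

url = quality_url("computer-vision", "arclab-hku", "DEIO")
```

The resulting URL can then be fetched with `curl` as shown above, or with any HTTP client. The response schema is not documented here, so field names should be inspected from a live response rather than assumed.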