jac99/Egonn
EgoNN: Egocentric Neural Network for Point Cloud Based 6DoF Relocalization at the City Scale
This project helps autonomous vehicles, robots, and mapping systems determine their precise location and orientation (6DoF relocalization) within a city-scale environment. It takes 3D point cloud data, typically from a LiDAR sensor, and outputs an accurate pose (position and orientation) estimate. It is useful for engineers and researchers developing navigation systems that rely on detailed environmental understanding.
No commits in the last 6 months.
Use this if you need to precisely localize a vehicle or robot in a large-scale, complex outdoor environment using 3D LiDAR point clouds.
Not ideal if your application doesn't involve LiDAR point clouds or requires real-time, ultra-low-power localization on constrained hardware.
Stars: 65
Forks: 8
Language: Python
License: MIT
Category: Computer Vision
Last pushed: Mar 03, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/jac99/Egonn"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000 requests/day.
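For scripted use, the curl call above can be wrapped in a small helper. This is a minimal sketch: only the URL pattern comes from this page; the endpoint path structure (category/owner/repo) and the assumption that the response is JSON are inferred, not documented here.

```python
# Hypothetical helper for querying the pt-edge quality API shown above.
# Only the URL pattern is taken from this page; the JSON response format
# is an assumption.
import json
import urllib.request

BASE_URL = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a repository."""
    return f"{BASE_URL}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch quality data for a repo; assumes the endpoint returns JSON."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    # Reproduces the curl example from this page.
    print(quality_url("computer-vision", "jac99", "Egonn"))
```

No API key is needed at the free tier, so the request above is a plain GET; a paid key would presumably be passed as a header or query parameter, but that detail is not documented on this page.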
Higher-rated alternatives
changh95/visual-slam-roadmap
Roadmap to become a Visual-SLAM developer in 2026
coperception/coperception
An SDK for multi-agent collaborative perception.
w111liang222/lidar-slam-detection
LSD (LiDAR SLAM & Detection) is an open source perception architecture for autonomous vehicles/robots...
ika-rwth-aachen/Cam2BEV
TensorFlow Implementation for Computing a Semantically Segmented Bird's Eye View (BEV) Image...
lvchuandong/Awesome-Multi-Camera-3D-Occupancy-Prediction
Awesome papers and code about Multi-Camera 3D Occupancy Prediction, such as TPVFormer,...