ArdEngineer/SDR-SLAM
A Resilient SLAM Framework
This system helps autonomous robots and drones reliably understand their surroundings and track their position, even in challenging conditions. It consumes live camera frames and inertial (IMU) measurements and outputs precise pose and map estimates. Robotics engineers and developers building robust autonomous systems would use it to ensure continuous operation.
No commits in the last 6 months.
Use this if you need a visual-inertial simultaneous localization and mapping (SLAM) solution that can operate continuously, adapt to poor lighting, and resist malicious optical interference.
Not ideal if your application does not involve camera and IMU sensor data or if you need a very lightweight solution for resource-constrained embedded systems without GPU support.
Stars
11
Forks
3
Language
C++
License
GPL-3.0
Category
Computer Vision
Last pushed
Mar 26, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/ArdEngineer/SDR-SLAM"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
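The endpoint can also be called programmatically. A minimal Python sketch, assuming only the URL shape shown in the curl example above (the JSON response schema is not documented here, so the fetch is left generic):

```python
# Sketch: query the pt-edge quality API for a repository.
# Only the URL shape is taken from the curl example; the response
# fields are undocumented, so we just build the URL and fetch JSON.
import json
import urllib.request


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the pt-edge quality endpoint URL for a repository."""
    return (
        "https://pt-edge.onrender.com/api/v1/quality/"
        f"{category}/{owner}/{repo}"
    )


url = quality_url("computer-vision", "ArdEngineer", "SDR-SLAM")
print(url)

# No key is needed for up to 100 requests/day:
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)  # schema is an assumption; inspect it first
```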
Higher-rated alternatives
changh95/visual-slam-roadmap
Roadmap to become a Visual-SLAM developer in 2026
coperception/coperception
An SDK for multi-agent collaborative perception.
w111liang222/lidar-slam-detection
LSD (LiDAR SLAM & Detection) is an open source perception architecture for autonomous vehicles/robotics
ika-rwth-aachen/Cam2BEV
TensorFlow Implementation for Computing a Semantically Segmented Bird's Eye View (BEV) Image...
lvchuandong/Awesome-Multi-Camera-3D-Occupancy-Prediction
Awesome papers and code about Multi-Camera 3D Occupancy Prediction, such as TPVFormer,...