4DVLab/Vision-Centric-BEV-Perception
Vision-Centric BEV Perception: A Survey
This project offers a comprehensive survey of methods for converting images from vehicle-mounted cameras into a bird's-eye view (BEV): a unified, top-down representation of the scene that highlights objects and their positions, which is crucial for scene understanding in autonomous driving. Note that the repository itself is a curated collection of papers, not software; the surveyed methods are what perform the view transformation. Autonomous vehicle engineers and researchers designing perception systems for self-driving cars would use this resource.
737 stars. No commits in the last 6 months.
Use this if you are developing or researching autonomous driving systems and need to understand the various techniques for transforming camera images into a bird's-eye view for object detection, segmentation, and planning.
Not ideal if you are looking for a ready-to-use software library or tool for immediate deployment, as this is a survey of research papers rather than an implementation.
Stars: 737
Forks: 72
Language: —
License: —
Category: —
Last pushed: Sep 03, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/4DVLab/Vision-Centric-BEV-Perception"
Open to everyone: 100 requests/day with no key required. Get a free key for 1,000 requests/day.
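The same endpoint can also be called programmatically. Below is a minimal Python sketch; the URL pattern (`/quality/<category>/<owner>/<repo>`) is taken from the curl example above, but the structure of the JSON response is not documented here, so the code treats it as an opaque dictionary.

```python
import json
from urllib.parse import quote
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def endpoint(category: str, owner: str, repo: str) -> str:
    # Build the per-repository quality URL shown in the curl example.
    return f"{BASE}/{quote(category)}/{quote(owner)}/{quote(repo)}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    # Fetch and decode the JSON payload. The exact response fields are
    # not specified on this page, so callers should inspect the dict
    # rather than assume particular keys.
    with urlopen(endpoint(category, owner, repo)) as resp:
        return json.load(resp)
```

For example, `endpoint("computer-vision", "4DVLab", "Vision-Centric-BEV-Perception")` reproduces the URL used in the curl command.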
Higher-rated alternatives
changh95/visual-slam-roadmap
Roadmap to become a Visual-SLAM developer in 2026
coperception/coperception
An SDK for multi-agent collaborative perception.
w111liang222/lidar-slam-detection
LSD (LiDAR SLAM & Detection) is an open source perception architecture for autonomous vehicle/robotic
ika-rwth-aachen/Cam2BEV
TensorFlow Implementation for Computing a Semantically Segmented Bird's Eye View (BEV) Image...
lvchuandong/Awesome-Multi-Camera-3D-Occupancy-Prediction
Awesome papers and code about Multi-Camera 3D Occupancy Prediction, such as TPVFormer,...