Charmve/autopilot-perception

End-to-End Autopilot Perception Playbook

Score: 46 / 100 (Emerging)

This project serves as a comprehensive playbook for understanding and building perception systems in autonomous driving. It takes raw data from vehicle cameras and LiDAR sensors, processes it through various algorithms, and outputs crucial information like detected pedestrians, vehicles, lane lines, and traversable areas. This resource is designed for engineers, researchers, or even career-changers looking to gain practical, hands-on experience in autonomous driving perception.

128 stars. No commits in the last 6 months.

Use this if you want to understand the entire pipeline of an autonomous driving perception system, from sensor input to model deployment, and gain practical experience with real-world algorithms.

Not ideal if you are looking for a high-level overview without getting into the technical details and hands-on implementation of perception algorithms.

autonomous-driving vehicle-perception sensor-fusion object-detection computer-vision
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 20 / 25

How are scores calculated?
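Based on the breakdown above, the overall score appears to be the sum of the four 25-point category subscores; a quick check in Python:

```python
# Category subscores from the card above (each out of 25, 100 total).
subscores = {
    "Maintenance": 0,
    "Adoption": 10,
    "Maturity": 16,
    "Community": 20,
}

# Summing the four categories reproduces the headline score.
total = sum(subscores.values())
print(f"{total} / 100")  # 46 / 100
```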

Stars: 128
Forks: 26
Language: HTML
License: GPL-3.0
Last pushed: Oct 07, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Charmve/autopilot-perception"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
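The same endpoint can also be called from a script. A minimal Python sketch: the URL pattern is taken from the curl example above, but the response schema is not documented here, so the fetch helper simply returns the parsed JSON as-is.

```python
import json
import urllib.request
from urllib.parse import quote

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-score endpoint URL for a repository."""
    return f"{API_BASE}/{quote(category)}/{quote(owner)}/{quote(repo)}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch the quality record (field names in the JSON are undocumented here)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

print(quality_url("ml-frameworks", "Charmve", "autopilot-perception"))
# https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Charmve/autopilot-perception
```

Without an API key this shares the 100-requests/day anonymous quota noted above.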