Rishikesh-Jadhav/Birds-Eye-View-Trajectory-Prediction-for-Autonomous-Driving

This repository contains a comprehensive investigation of motion prediction for autonomous vehicles using the PowerBEV framework and a multi-camera setup. We validated trajectory forecasting on the NuScenes, Woven, and Argoverse datasets and identified challenges in model generalization across them.

Score: 28 / 100 (Experimental)

This project helps automotive engineers and researchers developing autonomous driving systems predict vehicle and pedestrian movements. It takes raw perception data from multiple cameras and LiDAR sensors (like those found on self-driving cars) and outputs detailed, bird's-eye-view forecasts of where other road users will be in the near future, which is crucial for planning safe and efficient autonomous vehicle maneuvers.

No commits in the last 6 months.

Use this if you need to accurately forecast the trajectories of moving objects around an autonomous vehicle using multi-camera and LiDAR sensor data.

Not ideal if you are looking for a solution that generalizes perfectly across all autonomous driving datasets without any fine-tuning or adaptation.

Tags: autonomous-driving, motion-prediction, robotics, computer-vision, sensor-fusion
Status: Stale (6m) · No Package · No Dependents

Maintenance: 0 / 25
Adoption: 5 / 25
Maturity: 16 / 25
Community: 7 / 25


Stars: 11
Forks: 1
Language: Python
License:
Last pushed: May 05, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Rishikesh-Jadhav/Birds-Eye-View-Trajectory-Prediction-for-Autonomous-Driving"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
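For scripted access, the same endpoint can be called from Python. A minimal sketch using only the standard library; the endpoint URL is taken from the curl example above, but the shape of the JSON response is not documented here, so inspect it before relying on specific fields:

```python
import json
import urllib.request

# Base endpoint as shown in the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(collection: str, owner: str, repo: str) -> str:
    """Build the quality-score API URL for a repository."""
    return f"{BASE}/{collection}/{owner}/{repo}"

url = quality_url(
    "ml-frameworks",
    "Rishikesh-Jadhav",
    "Birds-Eye-View-Trajectory-Prediction-for-Autonomous-Driving",
)
print(url)

# Uncomment to fetch the data (no API key needed, 100 requests/day):
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
#     print(json.dumps(data, indent=2))
```

The network call is left commented out so the snippet runs offline; swap in your own key-authenticated request if you need the 1,000/day tier.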