LTTM/FlyAwareV2
Repository containing all the code used to generate and process the FlyAware dataset.
This project provides a comprehensive dataset for training computer vision models that understand urban scenes from drone footage: a large collection of drone images, depth maps, and semantic labels covering varied weather conditions and altitudes. Researchers and engineers building autonomous drone navigation or urban-planning systems can use it to train more robust and accurate models.
Use this if you need a large, diverse dataset to train AI models for drone-based urban scene understanding, especially when considering challenging weather conditions like fog, rain, or night.
Not ideal if your primary focus is on non-urban environments, or if you only require simple image classification without detailed semantic understanding.
Stars: 11
Forks: 1
Language: Python
License: GPL-3.0
Category:
Last pushed: Jan 16, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/LTTM/FlyAwareV2"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
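The curl call above can also be made from Python. A minimal sketch using only the standard library is shown below; note that the response schema and the API-key header name (`X-API-Key`) are assumptions, not documented by this listing, so inspect the actual JSON before relying on field names.

```python
import urllib.request

# Endpoint taken verbatim from the curl example above.
API_URL = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/LTTM/FlyAwareV2"

def build_request(api_key=None):
    """Build a GET request for the quality endpoint.

    An optional key raises the daily limit from 100 to 1,000 requests;
    the header name used here is hypothetical.
    """
    headers = {"Accept": "application/json"}
    if api_key:
        headers["X-API-Key"] = api_key  # hypothetical header name
    return urllib.request.Request(API_URL, headers=headers)

# Sending the request (network access required):
#   with urllib.request.urlopen(build_request()) as resp:
#       data = resp.read()
req = build_request()
print(req.full_url)
```

Keeping the request construction separate from the network call makes the key handling easy to test without hitting the 100-requests/day limit.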
Higher-rated alternatives
DeepTrackAI/DeepTrack2
DeepTrack2 is a modular Python library for generating, manipulating, and analyzing image data...
abhineet123/Deep-Learning-for-Tracking-and-Detection
Collection of papers, datasets, code and other resources for object tracking and detection using...
NVIDIA-ISAAC-ROS/isaac_ros_dnn_inference
NVIDIA-accelerated DNN model inference ROS 2 packages using NVIDIA Triton/TensorRT for both...
DagnyT/hardnet
Hardnet descriptor model - "Working hard to know your neighbor's margins: Local descriptor...
rafellerc/Pytorch-SiamFC
Pytorch implementation of "Fully-Convolutional Siamese Networks for Object Tracking"