NeLy-EPFL/DeepFly3D
Motion capture (markerless 3D pose estimation) pipeline and helper GUI for tethered Drosophila.
This project helps neuroscientists and biologists precisely track the 3D movements of tethered fruit flies (Drosophila) during experiments. You provide multi-view video recordings or image sequences of the fly, and it outputs accurate 3D pose estimates, identifying key body points. It's designed for researchers studying insect locomotion, behavior, or neural control in a lab setting.
Use this if you need to analyze the detailed 3D kinematics of tethered Drosophila from multi-camera video footage, without needing physical markers on the fly.
Not ideal if you need to track free-flying insects or other animal species, or if your experimental setup does not involve multiple synchronized camera views.
Stars: 96
Forks: 17
Language: Jupyter Notebook
License: LGPL-3.0
Category: computer-vision
Last pushed: Mar 07, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/NeLy-EPFL/DeepFly3D"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
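The curl command above can also be scripted. Below is a minimal sketch using only Python's standard library; the endpoint URL is taken from the card, but the shape of the JSON response is an assumption, so inspect the payload before relying on specific keys.

```python
# Sketch: query the quality API for a repository using the standard library.
# The response schema is assumed, not documented here -- inspect before use.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (requires network access)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

# Building the URL needs no network access:
url = quality_url("computer-vision", "NeLy-EPFL", "DeepFly3D")
# → "https://pt-edge.onrender.com/api/v1/quality/computer-vision/NeLy-EPFL/DeepFly3D"
```

A call such as `fetch_quality("computer-vision", "NeLy-EPFL", "DeepFly3D")` would then return the same data as the curl example, subject to the daily rate limit noted above.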
Related tools
DeepLabCut/DeepLabCut
Official implementation of DeepLabCut: Markerless pose estimation of user-defined features with...
openpifpaf/openpifpaf
Official implementation of "OpenPifPaf: Composite Fields for Semantic Keypoint Detection and...
lambdaloop/anipose
🐜🐀🐒🚶 A toolkit for robust markerless 3D pose estimation
DIYer22/bpycv
Computer vision utils for Blender (generate instance annotation, depth, and 6D pose with one line of code)
NVIDIA-ISAAC-ROS/isaac_ros_pose_estimation
Deep learned, NVIDIA-accelerated 3D object pose estimation