abs711/The-way-of-the-future
A dataset of egocentric vision, eye-tracking, and full-body kinematics from human locomotion in out-of-the-lab environments, along with example code for different use cases of the dataset.
This dataset provides a rich collection of information about how people move and see in everyday environments. It includes recordings from head-mounted cameras (what a person sees), their eye movements, and detailed body motion data. Researchers in fields like human-computer interaction, robotics, or cognitive science can use this data to understand and model human behavior in real-world settings.
No commits in the last 6 months.
Use this if you need extensive real-world human motion, vision, and eye-tracking data for research into human behavior, perception, or developing motion-aware systems.
Not ideal if you require a simple dataset for static image analysis or controlled laboratory motion capture without egocentric vision components.
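The repository's example code is written in Python. As a rough illustration of the kind of analysis such combined recordings support, the sketch below aligns an eye-tracking stream with a body-kinematics stream on a shared timebase; the file names, column names, and units are hypothetical placeholders for illustration, not the dataset's actual layout, so consult the repository's own example code for the real format.

# Hypothetical sketch: align eye-tracking and full-body kinematics streams
# on a common timebase. File names and column names are assumptions made
# for illustration only.
import pandas as pd

gaze = pd.read_csv("trial01_gaze.csv")                # assumed columns: timestamp, gaze_x, gaze_y
kinematics = pd.read_csv("trial01_kinematics.csv")    # assumed columns: timestamp, joint angles, ...

# merge_asof requires both frames to be sorted on the key column.
gaze = gaze.sort_values("timestamp")
kinematics = kinematics.sort_values("timestamp")

# Match each gaze sample to the nearest kinematics sample within 10 ms,
# assuming timestamps are expressed in seconds.
aligned = pd.merge_asof(
    gaze,
    kinematics,
    on="timestamp",
    direction="nearest",
    tolerance=0.010,
)

print(aligned.head())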
Stars: 12
Forks: 1
Language: Python
License: MIT
Category: Computer Vision
Last pushed: Nov 05, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/abs711/The-way-of-the-future"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000 requests/day.
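The same endpoint can also be queried from Python. A minimal sketch, assuming the response is JSON and that the no-key free tier described above applies; the response schema itself is not documented here, so the example simply pretty-prints whatever is returned.

# Fetch this catalog entry via the API using only the standard library.
import json
import urllib.request

URL = (
    "https://pt-edge.onrender.com/api/v1/quality/"
    "computer-vision/abs711/The-way-of-the-future"
)

with urllib.request.urlopen(URL, timeout=30) as resp:
    data = json.load(resp)

print(json.dumps(data, indent=2))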
Higher-rated alternatives
DeepLabCut/DeepLabCut
Official implementation of DeepLabCut: Markerless pose estimation of user-defined features with...
openpifpaf/openpifpaf
Official implementation of "OpenPifPaf: Composite Fields for Semantic Keypoint Detection and...
lambdaloop/anipose
🐜🐀🐒🚶 A toolkit for robust markerless 3D pose estimation
DIYer22/bpycv
Computer vision utils for Blender (generate instance annotation, depth, and 6D pose with one line of code)
NeLy-EPFL/DeepFly3D
Motion capture (markerless 3D pose estimation) pipeline and helper GUI for tethered Drosophila.