abs711/The-way-of-the-future

A dataset of egocentric vision, eye tracking, and full-body kinematics from human locomotion in out-of-the-lab environments, plus example use cases of the dataset with accompanying code.

Quality score: 27 / 100 (Experimental)

This dataset provides a rich collection of information about how people move and see in everyday environments. It includes recordings from head-mounted cameras (what a person sees), their eye movements, and detailed body motion data. Researchers in fields like human-computer interaction, robotics, or cognitive science can use this data to understand and model human behavior in real-world settings.

No commits in the last 6 months.

Use this if you need extensive real-world human motion, vision, and eye-tracking data for research into human behavior, perception, or developing motion-aware systems.

Not ideal if you require a simple dataset for static image analysis or controlled laboratory motion capture without egocentric vision components.

Tags: human-behavior-research, robotics, gait-analysis, cognitive-science, egocentric-vision
Badges: Stale (6 months), No Package, No Dependents
Maintenance: 0 / 25
Adoption: 5 / 25
Maturity: 16 / 25
Community: 6 / 25

How are scores calculated?
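The overall score appears to be the sum of the four per-category subscores shown above (an assumption based on the numbers on this page, not a documented formula). A minimal check:

```python
# Subscores as listed on this page; the summing rule is an assumption.
subscores = {"Maintenance": 0, "Adoption": 5, "Maturity": 16, "Community": 6}
total = sum(subscores.values())
print(total)  # 27, matching the 27 / 100 overall score
```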

Stars: 12
Forks: 1
Language: Python
License: MIT
Last pushed: Nov 05, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/abs711/The-way-of-the-future"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
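The same endpoint can be called from Python. This is a minimal sketch: the URL path comes from the curl example above, but the response schema is not documented here, so the actual fetch is left commented out.

```python
# Build the quality-API URL for a repository. The base URL and path
# segments are taken from the curl example on this page.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Return the quality-report URL for the given repository."""
    return f"{BASE}/{category}/{owner}/{repo}"

url = quality_url("computer-vision", "abs711", "The-way-of-the-future")
print(url)

# To actually fetch the report (100 requests/day, no key needed):
# import json, urllib.request
# with urllib.request.urlopen(url, timeout=10) as resp:
#     report = json.load(resp)
```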