david-wb/gaze-estimation

A deep-learning-based gaze-estimation framework implemented in PyTorch

Score: 39 / 100 (Emerging)

This tool helps researchers and human-computer interaction (HCI) specialists understand where a person is looking in real time. It takes live video from a webcam, processes the eye region, and outputs eye-region landmark predictions together with the gaze direction as pitch and yaw angles. This is useful for anyone studying human attention, evaluating interfaces, or building gaze-controlled systems.
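The pitch and yaw outputs are a pair of angles; downstream code typically converts them into a 3D unit gaze vector. A minimal sketch of that conversion, assuming angles in radians and one common gaze-estimation axis convention (the repository's exact convention may differ):

import numpy as np

def pitch_yaw_to_vector(pitch: float, yaw: float) -> np.ndarray:
    # Convert gaze angles (radians) into a unit 3D direction vector.
    # Axis convention here is an assumption: x right, y up, z toward the camera.
    return np.array([
        -np.cos(pitch) * np.sin(yaw),  # horizontal component
        -np.sin(pitch),                # vertical component
        -np.cos(pitch) * np.cos(yaw),  # depth component
    ])

# Example: gaze 10 degrees up and 5 degrees to the viewer's left
vec = pitch_yaw_to_vector(np.radians(10), np.radians(-5))
print(vec, np.linalg.norm(vec))  # the vector always has unit length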

194 stars. No commits in the last 6 months.

Use this if you need to accurately track a user's eye gaze and landmark positions from a live camera feed for research or application development.

Not ideal if you require highly precise, medical-grade eye-tracking or if you don't have access to a GPU for real-time performance.

Tags: human-computer interaction, attention tracking, eye-tracking research, user experience studies, gaze-controlled systems
No License · Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 8 / 25
Community: 21 / 25
(The four component scores sum to the overall 39 / 100.)


Stars: 194
Forks: 37
Language: Jupyter Notebook
License: None
Last pushed: Feb 26, 2020
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/david-wb/gaze-estimation"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
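The same data can be fetched in Python with only the standard library (the endpoint is assumed to return JSON, as the /api/v1/ path suggests):

import json
import urllib.request

URL = ("https://pt-edge.onrender.com/api/v1/quality/"
       "ml-frameworks/david-wb/gaze-estimation")

with urllib.request.urlopen(URL) as resp:
    data = json.load(resp)  # assumes a JSON response body

print(json.dumps(data, indent=2))  # inspect the returned fields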