david-wb/gaze-estimation
A deep-learning-based gaze estimation framework implemented in PyTorch
This tool helps researchers and human-computer interaction (HCI) specialists understand where a person is looking in real time. It takes live video from a webcam, processes the eye region, and outputs eye-region landmark predictions and the direction of gaze as pitch and yaw angles. This is useful for anyone studying human attention or interface interaction, or developing gaze-controlled systems.
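Since the gaze output is a (pitch, yaw) angle pair, downstream code usually converts it to a 3D unit direction vector. A minimal sketch of that conversion, assuming a common camera-frame convention (x right, y down, z forward, gaze pointing toward the camera at zero pitch/yaw; this repo's exact convention may differ):

```python
import math

def gaze_angles_to_vector(pitch, yaw):
    """Convert gaze pitch/yaw (radians) to a unit 3D direction vector.

    Convention assumed here (not taken from the repo): x right, y down,
    z forward, so straight-ahead gaze (pitch=0, yaw=0) maps to (0, 0, -1).
    """
    x = -math.cos(pitch) * math.sin(yaw)
    y = -math.sin(pitch)
    z = -math.cos(pitch) * math.cos(yaw)
    return (x, y, z)

# Straight-ahead gaze points along the negative z axis.
print(gaze_angles_to_vector(0.0, 0.0))
```

The vector form makes it easy to compare gaze against scene geometry, e.g. via a dot product with a screen-plane normal.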
194 stars. No commits in the last 6 months.
Use this if you need to accurately track a user's eye gaze and landmark positions from a live camera feed for research or application development.
Not ideal if you require highly precise, medical-grade eye-tracking or if you don't have access to a GPU for real-time performance.
Stars
194
Forks
37
Language
Jupyter Notebook
License
—
Category
ML Frameworks
Last pushed
Feb 26, 2020
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/david-wb/gaze-estimation"
Open to everyone: 100 requests/day with no key. A free key raises the limit to 1,000 requests/day.
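The curl command above can be reproduced from Python with only the standard library. A minimal sketch, assuming the endpoint returns JSON; the field names used below ("stars", "forks") are assumptions for illustration, so check the actual payload before relying on them:

```python
import json
import urllib.request

# Endpoint from the curl example above.
URL = ("https://pt-edge.onrender.com/api/v1/quality/"
       "ml-frameworks/david-wb/gaze-estimation")

def fetch_entry(url=URL):
    """Fetch and decode one catalog entry (requires network access)."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8"))

# The parsing step works the same on a canned payload, which is handy
# for offline testing; these field names are hypothetical.
sample = '{"stars": 194, "forks": 37, "language": "Jupyter Notebook"}'
entry = json.loads(sample)
print(entry["stars"])
```

Calling `fetch_entry()` counts against the daily rate limit, so cache responses where possible.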
Higher-rated alternatives
sbobek/tobii-pytracker
Tobii Eyetracker usage and analysis with Python SDK (no Tobii Labs needed)
cpury/lookie-lookie
Learning to track eye movement in the browser
glefundes/mobile-face-gaze
Lightweight gaze estimation with PyTorch.
emilyxxie/mona_lisa_eyes
A machine learning project. Turn on your webcam. Mona Lisa's eyes will follow you around.
manishanis/eye-training
Train your eyes. Read faster.