Ahmednull/L2CS-Net
The official PyTorch implementation of L2CS-Net for gaze estimation and tracking
L2CS-Net estimates where people are looking in real time. Given a live video feed, it predicts each detected face's gaze direction as a pair of angles (pitch and yaw). This is useful for researchers, human-computer interaction designers, and anyone analyzing user attention or behavior.
476 stars. No commits in the last 6 months.
Use this if you need to accurately track where a person's eyes are focused in a video, for example, to understand their interaction with a screen or physical object.
Not ideal if you need to detect objects or gestures, or if you only have static images and don't require gaze tracking over time.
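As a rough illustration of what "gaze direction" means above: gaze estimators like L2CS-Net output two angles, pitch and yaw, which can be converted into a 3D unit vector for visualization or screen-intersection math. A minimal sketch, assuming a common camera-frame convention (the exact axis convention varies between gaze codebases and is not taken from this repo):

```python
import math

def gaze_vector(pitch: float, yaw: float):
    """Convert pitch/yaw angles (radians) to a unit 3D gaze vector.

    Convention (an assumption, not from the repo): the camera looks
    down the -z axis, yaw rotates the gaze left/right, pitch up/down.
    """
    x = -math.cos(pitch) * math.sin(yaw)
    y = -math.sin(pitch)
    z = -math.cos(pitch) * math.cos(yaw)
    return (x, y, z)

# Straight-ahead gaze (both angles zero) points along -z.
print(gaze_vector(0.0, 0.0))
```

The vector is unit-length by construction, so comparing two gaze directions reduces to a dot product.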
Stars
476
Forks
106
Language
Python
License
MIT
Category
Computer Vision
Last pushed
Feb 02, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/Ahmednull/L2CS-Net"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
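The same endpoint can be called from Python instead of curl. A minimal sketch using only the standard library; the URL pattern comes from the curl example above, but the response schema is not documented here, so the fetch helper just returns the parsed JSON as-is:

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-API URL for a repository."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str, timeout: float = 10.0):
    """Fetch and parse the JSON response (no key needed up to 100 requests/day)."""
    with urllib.request.urlopen(quality_url(category, owner, repo), timeout=timeout) as resp:
        return json.load(resp)

print(quality_url("computer-vision", "Ahmednull", "L2CS-Net"))
```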
Related tools
charliegerard/gaze-detection
👀 Use machine learning in JavaScript to detect eye movements and build gaze-controlled experiences.
theankurkedia/blink-detection
Detect the user's blink and wink using machine learning
facemoji/mocap4face
Cross-platform SDK for facial motion capture producing blendshapes and rigid head poses in 3D...
jeeliz/jeelizGlanceTracker
Real-time JavaScript/WebGL library to detect whether the user is looking at the screen....
Hallway-Inc/AvatarWebKit
Web-first SDK that provides real-time ARKit-compatible 52 blend shapes from a camera feed, video...