Duke-I3T-Lab/AR_CPR_SA
The official code repository for the ISMAR 2025 paper "Will You Be Aware? Eye Tracking-Based Modeling of Situational Awareness in Augmented Reality"
This project helps medical trainers and researchers understand how well people maintain situational awareness during augmented reality (AR) guided cardiopulmonary resuscitation (CPR). By analyzing eye-tracking data from AR devices such as the Magic Leap 2, it can predict whether a user has good or poor awareness when an unexpected incident occurs, such as the patient bleeding or vomiting. This is useful for evaluating and improving AR training systems for emergency medical procedures.
Use this if you are a researcher or medical educator studying human performance and situational awareness in AR-guided medical training scenarios, particularly for CPR.
Not ideal if you need a plug-and-play solution for real-time patient monitoring or a general AR development kit, as it is focused on research modeling of awareness during training.
Stars: 9
Forks: —
Language: Python
License: MIT
Category: —
Last pushed: Nov 10, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/Duke-I3T-Lab/AR_CPR_SA"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
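The curl call above can also be made from Python. A minimal sketch using only the standard library; the endpoint URL is the one shown on this page, while the helper names and the response schema are assumptions (the JSON shape is not documented here, so it is returned as-is):

```python
import json
import urllib.request

# Base of the pt-edge quality API, taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def build_quality_url(category: str, owner: str, repo: str) -> str:
    """Assemble the quality-API endpoint for a given repository.

    Hypothetical helper: only the exact computer-vision URL above is
    confirmed; other categories/repos are assumed to follow the same pattern.
    """
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the repo's quality data (schema undocumented here)."""
    url = build_quality_url(category, owner, repo)
    with urllib.request.urlopen(url) as resp:  # keyless tier: 100 requests/day
        return json.load(resp)

# Reconstructs the exact URL from the curl example:
url = build_quality_url("computer-vision", "Duke-I3T-Lab", "AR_CPR_SA")
```

Note that `fetch_quality` performs a live network request, so it is subject to the daily rate limits described above.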
Higher-rated alternatives
andyzeng/apc-vision-toolbox
MIT-Princeton Vision Toolbox for the Amazon Picking Challenge 2016 - RGB-D ConvNet-based object...
OSU-NLP-Group/UGround
[ICLR'25 Oral] UGround: Universal GUI Visual Grounding for GUI Agents
Ewenwan/MVision
Robot vision, mobile robots, VS-SLAM, ORB-SLAM2, deep-learning object detection (YOLOv3), action detection, OpenCV, PCL, machine learning, autonomous driving
leggedrobotics/wild_visual_navigation
Wild Visual Navigation: A system for fast traversability learning via pre-trained models and...
microsoft/event-vae-rl
Visuomotor policies from event-based cameras through representation learning and reinforcement...