Gaze Comes in Handy: Predicting and Preventing Erroneous Hand Actions in AR-Supported Manual Tasks
Metadata only
Date
2021
Type
- Conference Paper
ETH Bibliography
yes
Abstract
Emerging Augmented Reality headsets incorporate gaze and hand tracking and can, thus, observe the user’s behavior without interfering with ongoing activities. In this paper, we analyze hand-eye coordination in real-time to predict hand actions during target selection and warn users of potential errors before they occur. In our first user study, we recorded 10 participants playing a memory card game, which involves frequent hand-eye coordination with little task-relevant information. We found that participants’ gaze locked onto target cards 350ms before the hands touched them in 73.3% of all cases, which coincided with the peak velocity of the hand moving to the target. Based on our findings, we then introduce a closed-loop support system that monitors the user’s fingertip position to detect the first card turn and analyzes gaze, hand velocity and trajectory to predict the second card before it is turned by the user. In a second study with 12 participants, our support system correctly displayed color-coded visual alerts in a timely manner with an accuracy of 85.9%. The results indicate the high value of eye and hand tracking features for behavior prediction and provide a first step towards predictive real-time user support.
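The prediction idea in the abstract — the gaze "locks onto" the target card around the hand's peak velocity, roughly 350 ms before touch — can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the `Sample` layout, the `min_lock_ms` threshold, and the peak-velocity heuristic are assumptions; only the ~350 ms gaze lead comes from the reported findings.

```python
# Hedged sketch of gaze-based target prediction: report the card the gaze has
# fixated continuously up to the moment of peak hand speed. Data layout and
# threshold are illustrative assumptions, not the paper's actual method.
from dataclasses import dataclass

@dataclass
class Sample:
    t: float           # timestamp in seconds
    gaze_target: int   # id of card currently fixated (-1 if none)
    hand_speed: float  # hand speed in m/s

def predict_target(samples, min_lock_ms=100):
    """Predict the card the hand will touch, or -1 if no stable gaze lock."""
    if not samples:
        return -1
    # The first study found the gaze lock coincides with the hand's peak
    # velocity, ~350 ms before touch -- so evaluate gaze at that moment.
    peak = max(samples, key=lambda s: s.hand_speed)
    target = peak.gaze_target
    # Walk backwards from the peak to measure how long the gaze has
    # already stayed on the same card.
    lock_start = peak.t
    for s in reversed([s for s in samples if s.t <= peak.t]):
        if s.gaze_target != target:
            break
        lock_start = s.t
    locked_ms = (peak.t - lock_start) * 1000
    return target if target != -1 and locked_ms >= min_lock_ms else -1
```

A real system would run this continuously on streaming tracker data and trigger the color-coded alert when the predicted card differs from the correct one; that alerting logic is beyond what the record here describes.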
Publication status
published
Book title
2021 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)
Publisher
IEEE
Subject
Human-centered computing; Human computer interaction (HCI); Interaction paradigms; Mixed / augmented reality
Organisational unit
03943 - Meboldt, Mirko / Meboldt, Mirko
09649 - Holz, Christian / Holz, Christian