Item Type | Conference or Workshop Item (Paper)
Abstract | Mixed Reality (MR) is becoming an integral part of many context-aware industrial applications. In maintenance and remote support operations, the individual steps of computer-supported (cooperative) work can be defined and presented to human operators through MR headsets. Tracking of eye movements can provide valuable insights into a user’s decision-making and interaction processes. Thus, our overarching goal is to better understand the visual inspection behavior of machine operators on shopfloors and to find ways to provide them with attention-aware and context-aware assistance through MR headsets that increasingly come with eye tracking (ET) as a default feature. Toward this goal, in two industrial scenarios, we used two mobile eye tracking devices and systematically compared the visual inspection behavior of novice and expert operators. In this paper, we present our preliminary findings and lessons learned.
Authors | Bektas, Kenan; Strecker, Jannis Rene; Mayer, Simon & Stolze, Markus
Research Team | https://ics.unisg.ch/chair-interactions-mayer/
Language | English
Keywords | eye tracking, mixed reality, industrial operations, CSCW, automation, user engagement
Subjects | computer science
HSG Classification | contribution to scientific community
Date | 30 April 2022
Publisher | CEUR Workshop Proceedings
Event Title | AutomationXP22: Engaging with Automation, CHI'22
Event Location | New Orleans, LA
Event Dates | 30 April 2022
Contact Email Address | kenan.bektas@unisg.ch
Depositing User | Dr. Kenan Bektas
Date Deposited | 24 May 2022 09:57
Last Modified | 24 May 2022 09:57
URI | https://www.alexandria.unisg.ch/publications/266339