  • Publication
    NeighboAR: Efficient Object Retrieval using Proximity- and Gaze-based Object Grouping with an AR System
    (ACM, 2024-05-28)
    Aleksandar Slavuljica
    Humans recognize only a few items in a scene at once and memorize three to seven items in the short term. Such limitations can be mitigated using cognitive offloading (e.g., sticky notes, digital reminders). We studied whether a gaze-enabled Augmented Reality (AR) system could facilitate cognitive offloading and improve object retrieval performance. To this end, we developed NeighboAR, which detects objects in a user's surroundings and generates a graph that stores object proximity relationships and the user's gaze dwell times for each object. In a controlled experiment, we asked N=17 participants to inspect randomly distributed objects and later recall the position of a given target object. Our results show that displaying the target together with the proximity object that received the longest user gaze dwell time helps recall the position of the target. Specifically, NeighboAR significantly reduces retrieval time by 33%, the number of errors by 71%, and perceived workload by 10%.
    (An illustrative sketch of such a proximity/gaze graph appears after this list.)
  • Publication
    GEAR: Gaze-enabled augmented reality for human activity recognition
    (ACM, 2023-05-30)
    Hermann, Jonas; Jenss, Kay Erik; Soler, Marc Elias
    Head-mounted Augmented Reality (AR) displays overlay digital information on physical objects. Through eye tracking, they allow novel interaction methods and provide insights into user attention, intentions, and activities. However, only a few studies have used gaze-enabled AR displays for human activity recognition (HAR). In an experimental study, we collected gaze data from 10 users on a HoloLens 2 (HL2) while they performed three activities (i.e., read, inspect, search). We trained machine learning models (SVM, Random Forest, Extremely Randomized Trees) on extracted features and achieved up to 98.7% activity-recognition accuracy. On the HL2, we provided users with AR feedback relevant to their current activity. We present the components of our system (GEAR), including a novel solution to enable the controlled sharing of collected data. We provide the scripts and anonymized datasets, which can be used as teaching material in graduate courses or for reproducing our findings.
    (A minimal sketch of the classification step appears after this list.)
  • Publication
    EToS-1: Eye Tracking on Shopfloors for User Engagement with Automation
    (CEUR Workshop Proceedings, 2022-04-30)
    Stolze, Markus
    Mixed Reality (MR) is becoming an integral part of many context-aware industrial applications. In maintenance and remote support operations, the individual steps of computer-supported (cooperative) work can be defined and presented to human operators through MR headsets. Tracking of eye movements can provide valuable insights into a user's decision-making and interaction processes. Thus, our overarching goal is to better understand the visual inspection behavior of machine operators on shopfloors and to find ways to provide them with attention-aware and context-aware assistance through MR headsets that increasingly come with eye tracking (ET) as a default feature. Toward this goal, in two industrial scenarios, we used two mobile eye-tracking devices and systematically compared the visual inspection behavior of novice and expert operators. In this paper, we present our preliminary findings and lessons learned.
  • Publication
    GlassBoARd: A Gaze-Enabled AR Interface for Collaborative Work
    Recent research on remote collaboration focuses on improving the sense of co-presence and mutual understanding among collaborators, whereas there is limited research on using non-verbal cues such as gaze or head direction alongside the main communication channel. Our system, GlassBoARd, permits collaborators to see each other's gaze behavior and even make eye contact while communicating verbally and in writing. GlassBoARd features a transparent shared Augmented Reality interface situated between two users, allowing face-to-face collaboration. From the perspective of each user, the remote collaborator is represented as an avatar located behind the GlassBoARd, whose eye movements follow the remote collaborator's eye movements in real time. In three iterations, we improved the design of GlassBoARd and tested it with two use cases. Our preliminary evaluations showed that GlassBoARd provides an environment for conducting future user experiments on the effect of sharing eye gaze on communication bandwidth.
    (A sketch of the underlying gaze-sharing idea appears after this list.)
  • Publication
    Pupillometry for Measuring User Response to Movement of an Industrial Robot
    (2023-05-30)
    Damian Hostettler
    Interactive systems can adapt to individual users to increase productivity, safety, or acceptance. Previous research has focused on different factors, such as cognitive workload (CWL), to better understand and improve human-computer and human-robot interaction (HRI). We present the results of an HRI experiment that uses pupillometry to measure users' responses to robot movements. Our results demonstrate a significant change in pupil dilation, indicating higher CWL, as a result of increased movement speed of an articulated robot arm. This might permit improved interaction ergonomics by adapting the behavior of robots or other devices to individual users at run time.
    (A sketch of this kind of pupil-response comparison appears after this list.)
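The NeighboAR abstract above describes a graph that stores object proximity relationships and per-object gaze dwell times, plus a retrieval cue that pairs the target with the nearby object the user looked at longest. Below is a minimal Python sketch of such a structure; the class, method names, and example objects are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch of a proximity/gaze graph as described in the NeighboAR abstract.
from collections import defaultdict

class ProximityGazeGraph:
    def __init__(self):
        self.dwell = defaultdict(float)    # object -> accumulated gaze dwell time (s)
        self.neighbors = defaultdict(set)  # object -> objects detected nearby

    def add_dwell(self, obj, seconds):
        self.dwell[obj] += seconds

    def add_proximity(self, obj_a, obj_b):
        self.neighbors[obj_a].add(obj_b)
        self.neighbors[obj_b].add(obj_a)

    def best_cue(self, target):
        # Return the nearby object the user looked at longest; the abstract reports
        # that showing such an object together with the target aids recall.
        return max(self.neighbors.get(target, set()),
                   key=lambda o: self.dwell[o], default=None)

g = ProximityGazeGraph()
g.add_proximity("scissors", "red mug")
g.add_proximity("scissors", "stapler")
g.add_dwell("red mug", 2.4)
g.add_dwell("stapler", 0.7)
print(g.best_cue("scissors"))  # -> red mug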
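The GEAR abstract above mentions training SVM, Random Forest, and Extremely Randomized Trees models on features extracted from gaze data to recognize three activities (read, inspect, search). The scikit-learn sketch below shows that classification step in outline only; the features and labels are synthetic placeholders, not the GEAR dataset, and the published pipeline may differ.

# Synthetic stand-in for per-window gaze features (e.g., fixation count, mean fixation
# duration, saccade amplitude, dispersion) and activity labels (0=read, 1=inspect, 2=search).
import numpy as np
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier, ExtraTreesClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))
y = rng.integers(0, 3, size=300)

models = {
    "SVM": SVC(kernel="rbf"),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "Extremely Randomized Trees": ExtraTreesClassifier(n_estimators=200, random_state=0),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: mean cross-validated accuracy = {acc:.3f}")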
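GlassBoARd, as summarized above, drives a remote avatar's eyes with the collaborator's own eye movements. The sketch below shows one plausible shape for the gaze samples such a system could stream between clients; the message fields and JSON encoding are assumptions made for illustration, not the published implementation.

# Hypothetical gaze-sharing message for a GlassBoARd-like setup.
import json
from dataclasses import dataclass, asdict

@dataclass
class GazeSample:
    timestamp_ms: int
    board_x: float     # gaze point on the shared board, normalized 0..1
    board_y: float
    eye_contact: bool  # True if the user is looking at the remote avatar's face

def encode(sample: GazeSample) -> str:
    return json.dumps(asdict(sample))

def decode(payload: str) -> GazeSample:
    # The receiving client would use this sample to update the avatar's eye direction,
    # so the local user sees where the remote collaborator is looking.
    return GazeSample(**json.loads(payload))

sample = GazeSample(timestamp_ms=1716883200000, board_x=0.42, board_y=0.77, eye_contact=False)
print(decode(encode(sample)))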
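The pupillometry abstract above reports a significant increase in pupil dilation, and thus inferred cognitive workload, with faster robot movements. Below is a sketch of the kind of baseline-corrected, paired comparison such a claim implies; the numbers are fabricated placeholders and the authors' actual analysis may differ.

# Fabricated pupil diameters (mm) for a within-subjects slow-vs-fast comparison.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(1)
n = 20
baseline = rng.normal(3.5, 0.3, n)            # resting pupil diameter per participant
slow = baseline + rng.normal(0.05, 0.05, n)   # diameter during slow robot movements
fast = baseline + rng.normal(0.20, 0.05, n)   # diameter during fast robot movements

# Subtracting the baseline isolates the task-evoked pupil response per participant.
slow_response = slow - baseline
fast_response = fast - baseline
t, p = ttest_rel(fast_response, slow_response)
print(f"t = {t:.2f}, p = {p:.4f}")  # a larger response for fast movements suggests higher CWL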