  • Publication
    NeighboAR: Efficient Object Retrieval using Proximity- and Gaze-based Object Grouping with an AR System
    (ACM, 2024-05-28)
    Aleksandar Slavuljica
    Humans only recognize a few items in a scene at once and memorize three to seven items in the short term. Such limitations can be mitigated using cognitive offloading (e.g., sticky notes, digital reminders). We studied whether a gaze-enabled Augmented Reality (AR) system could facilitate cognitive offloading and improve object retrieval performance. To this end, we developed NeighboAR, which detects objects in a user's surroundings and generates a graph that stores object proximity relationships and the user's gaze dwell times for each object. In a controlled experiment, we asked N=17 participants to inspect randomly distributed objects and later recall the position of a given target object. Our results show that displaying the target together with the proximity object with the longest user gaze dwell time helps recall the position of the target. Specifically, NeighboAR significantly reduces retrieval time by 33%, the number of errors by 71%, and perceived workload by 10%.
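The NeighboAR abstract above describes a graph that links nearby objects and accumulates gaze dwell time per object. The following Python sketch illustrates one minimal way such a structure could look; it is not the authors' implementation, and the class name ProximityGazeGraph, its methods, and the 0.5 m proximity threshold are all assumptions for illustration.

```python
# Illustrative sketch only: a proximity/gaze graph in the spirit of the
# NeighboAR abstract. All names and thresholds are hypothetical.
from collections import defaultdict

class ProximityGazeGraph:
    def __init__(self, proximity_threshold_m=0.5):
        self.positions = {}                   # object id -> (x, y, z) in metres
        self.dwell_ms = defaultdict(float)    # object id -> accumulated gaze dwell time
        self.neighbors = defaultdict(set)     # proximity edges between object ids
        self.threshold = proximity_threshold_m

    def add_object(self, obj_id, position):
        """Register a detected object and connect it to objects within the threshold."""
        self.positions[obj_id] = position
        for other, pos in self.positions.items():
            if other != obj_id and self._distance(position, pos) <= self.threshold:
                self.neighbors[obj_id].add(other)
                self.neighbors[other].add(obj_id)

    def add_gaze_sample(self, obj_id, duration_ms):
        """Accumulate dwell time whenever the eye tracker reports a fixation on an object."""
        self.dwell_ms[obj_id] += duration_ms

    def retrieval_cue(self, target_id):
        """Return the proximity neighbor with the longest gaze dwell time, if any."""
        candidates = self.neighbors.get(target_id, set())
        return max(candidates, key=lambda o: self.dwell_ms[o], default=None)

    @staticmethod
    def _distance(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
```

In this toy version, the cue shown alongside the target would simply be the neighboring object the user looked at longest during inspection.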
  • Publication
    MR Object Identification and Interaction: Fusing Object Situation Information from Heterogeneous Sources
    (ACM, 2023-09-28)
    Khakim Akhunov
    Federico Carbone
    Kasim Sinan Yildirim
    The increasing number of objects in ubiquitous computing environments creates a need for effective object detection and identification mechanisms that permit users to intuitively initiate interactions with these objects. While multiple approaches to such object detection (including visual object detection, fiducial markers, relative localization, or absolute spatial referencing) are available, each of these suffers from drawbacks that limit its applicability. In this paper, we propose ODIF, an architecture that permits the fusion of object situation information from such heterogeneous sources and that remains vertically and horizontally modular to allow extending and upgrading systems that are constructed accordingly. We furthermore present BLEARVIS, a prototype system that builds on the proposed architecture and integrates computer-vision (CV) based object detection with radio-frequency (RF) angle of arrival (AoA) estimation to identify BLE-tagged objects. In our system, the front camera of a Mixed Reality (MR) head-mounted display (HMD) provides a live image stream to a vision-based object detection module, while an antenna array that is mounted on the HMD collects AoA information from ambient devices. In this way, BLEARVIS is able to differentiate between visually identical objects in the same environment and can provide an MR overlay of information (data and controls) that relates to them. We include experimental evaluations of both the CV-based object detection and the RF-based AoA estimation, and discuss the applicability of the combined RF and CV pipelines in different ubiquitous computing scenarios. This research can form a starting point for integrating diverse object detection, identification, and interaction approaches that function across the electromagnetic spectrum, and beyond.
    Scopus citations: 2
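The fusion idea in the BLEARVIS abstract above can be pictured with a small sketch: match each BLE tag's estimated angle of arrival to the camera detection with the closest bearing, which tells visually identical objects apart. The data classes, the nearest-angle matching rule, and the 10° tolerance below are illustrative assumptions, not the system's actual pipeline.

```python
# Illustrative sketch only: toy fusion of CV detections and BLE AoA readings.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str            # class reported by the vision-based object detector
    bearing_deg: float    # horizontal angle of the bounding-box centre in the camera frame

@dataclass
class AoAReading:
    tag_id: str           # BLE tag identity
    angle_deg: float      # estimated angle of arrival at the HMD antenna array

def fuse(detections, aoa_readings, max_angle_error_deg=10.0):
    """Assign each BLE tag to the detected object with the closest bearing."""
    assignments = {}
    for reading in aoa_readings:
        best = min(detections,
                   key=lambda d: abs(d.bearing_deg - reading.angle_deg),
                   default=None)
        if best and abs(best.bearing_deg - reading.angle_deg) <= max_angle_error_deg:
            assignments[reading.tag_id] = best
    return assignments

# Two visually identical lamps are told apart by their tags' AoA.
lamps = [Detection("lamp", -12.0), Detection("lamp", 15.0)]
tags = [AoAReading("lamp-A", -10.5), AoAReading("lamp-B", 14.0)]
print(fuse(lamps, tags))
```

A real pipeline would of course have to handle noisy AoA estimates, multiple tags per bearing, and detections without tags; the sketch only shows the basic matching step.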
  • Publication
    EToS-1: Eye Tracking on Shopfloors for User Engagement with Automation
    (CEUR Workshop Proceedings, 2022-04-30)
    Markus Stolze
    Mixed Reality (MR) is becoming an integral part of many context-aware industrial applications. In maintenance and remote support operations, the individual steps of computer-supported (cooperative) work can be defined and presented to human operators through MR headsets. Tracking of eye movements can provide valuable insights into a user’s decision-making and interaction processes. Thus, our overarching goal is to better understand the visual inspection behavior of machine operators on shopfloors and to find ways to provide them with attention-aware and context-aware assistance through MR headsets that increasingly come with eye tracking (ET) as a default feature. Toward this goal, in two industrial scenarios, we used two mobile eye tracking devices and systematically compared the visual inspection behavior of novice and expert operators. In this paper we present our preliminary findings and lessons learned.
  • Publication
    ShoppingCoach: Using Diminished Reality to Prevent Unhealthy Food Choices in an Offline Supermarket Scenario
    Non-communicable diseases, such as obesity and diabetes, have a significant global impact on health outcomes. While governments worldwide focus on promoting healthy eating, individuals still struggle to follow dietary recommendations. Augmented Reality (AR) might be a useful tool to emphasize specific food products at the point of purchase. However, AR may also add visual clutter to an already complex supermarket environment. Instead, reducing the visual prevalence of unhealthy food products through Diminished Reality (DR) could be a viable alternative: We present ShoppingCoach, a DR prototype that identifies supermarket food products and visually diminishes them depending on the deviation of the target product's composition from dietary recommendations. In a study with 12 participants, we found that ShoppingCoach increased compliance with dietary recommendations from 75% to 100% and reduced decision time by 41%. These results demonstrate the promising potential of DR in promoting healthier food choices and thus enhancing public health.
    Scopus citations: 1
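As a rough illustration of how a DR prototype like the one in the ShoppingCoach abstract might turn a product's deviation from dietary recommendations into a diminishing strength, here is a toy Python sketch. The nutrient thresholds, the scoring rule, and all names are assumptions made for illustration and are not taken from the paper.

```python
# Illustrative sketch only: map nutritional deviation to a diminishing factor.
# Per-100 g limits below are assumed example values, not the paper's parameters.
RECOMMENDED_MAX_PER_100G = {"sugar_g": 22.5, "saturated_fat_g": 5.0, "salt_g": 1.5}

def deviation_score(nutrition):
    """Average relative exceedance over the assumed per-100 g limits (0 = compliant)."""
    excess = [max(0.0, nutrition.get(k, 0.0) / limit - 1.0)
              for k, limit in RECOMMENDED_MAX_PER_100G.items()]
    return sum(excess) / len(excess)

def diminish_factor(nutrition, strength=0.6):
    """0.0 = render the product normally; values toward 1.0 = diminish it strongly."""
    return min(1.0, strength * deviation_score(nutrition))

# A sugary, fatty product gets a high diminishing factor.
chocolate_bar = {"sugar_g": 47.0, "saturated_fat_g": 18.0, "salt_g": 0.2}
print(round(diminish_factor(chocolate_bar), 2))
```

The resulting factor could then drive how strongly the rendering pipeline desaturates or fades the product in the user's view.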
  • Publication
    Personalized Reality: Challenges of Responsible Ubiquitous Personalization
    (Gesellschaft für Informatik e.V., 2024-09-01)
    The expanding capabilities of Mixed Reality and Ubiquitous Computing technologies enable personalization to be increasingly integrated with physical reality in all areas of people's lives. While such ubiquitous personalization promises more inclusive, efficient, pleasurable, and safer everyday interaction, it may also entail serious societal consequences such as isolated perceptions of reality or a loss of control and agency. We present this paper to initiate a discussion towards the responsible creation of ubiquitous personalization experiences that mitigate these harmful implications while retaining the benefits of personalization. To this end, we present the concept of Personalized Reality (PR) to describe a perceived reality that has been adapted in response to personal user data. We provide avenues for future work, and list open questions and challenges towards the creation of responsible PR experiences.