Comparing Non-Visual and Visual Guidance Methods for Narrow Field of View Augmented Reality Displays

Note: We are unable to review this paper.

PubDate: September 2020

Teams: Bonn-Rhein-Sieg University of Applied Sciences; University of Bremen

Writers: Alexander Marquardt; Christina Trepkowski; Tom David Eibich; Jens Maiero; Ernst Kruijff; Johannes Schöning

PDF: Comparing Non-Visual and Visual Guidance Methods for Narrow Field of View Augmented Reality Displays

Abstract

Current augmented reality displays still have a very limited field of view compared to human vision. To localize out-of-view objects, researchers have predominantly explored visual guidance approaches that visualize information in the limited (in-view) screen space. Unfortunately, visual conflicts such as clutter or occlusion of information often arise, which can lead to degraded search performance and decreased awareness of the physical environment. In this paper, we compare a novel non-visual guidance approach based on audio-tactile cues with the state-of-the-art visual guidance technique EyeSee360 for localizing out-of-view objects in augmented reality displays with a limited field of view. In our user study, we evaluate both guidance methods in terms of search performance and situation awareness. We show that although audio-tactile guidance is generally slower than the well-performing EyeSee360 in terms of search times, it is on a par regarding the hit rate. Moreover, the audio-tactile method provides a significant improvement in situation awareness compared to the visual approach.
