Boundary conditions for information visualization with respect to the user’s gaze

Note: We don't have the ability to review this paper.

PubDate: March 2014

Teams: Technische Universität München, Garching b. München

Writers: Marcus Tönnis; Gudrun Klinker

PDF: Boundary conditions for information visualization with respect to the user’s gaze

Abstract

Gaze tracking in Augmented Reality is mainly used to trigger buttons and access information. Such selectable objects are usually placed in the world or in screen coordinates of a head- or hand-mounted display. Yet, no work has investigated options to place information with respect to the line of sight.

This work presents our first steps towards gaze-mounted information visualization and interaction, determining boundary conditions for such an approach. We propose a general concept for information presentation at an angular offset to the line of sight. A user can look around freely while having information attached near the line of sight. Whenever the user wants to look at the information and does so, the information is placed directly on the axis of sight for a short time.
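
To make the concept concrete, here is a minimal sketch of how such placement could work: a label direction is kept at a fixed angular offset from the gaze ray, and the label snaps onto the axis of sight for a short interval once the gaze turns toward it. All names, thresholds, and the rotation axis are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

OFFSET_DEG = 15.0      # assumed angular offset from the line of sight
SNAP_DEG = 3.0         # assumed gaze-to-label angle that triggers snapping
SNAP_DURATION = 1.5    # assumed time (s) the label stays on the axis of sight

def normalize(v):
    return v / np.linalg.norm(v)

def angle_deg(a, b):
    return np.degrees(np.arccos(np.clip(np.dot(normalize(a), normalize(b)), -1.0, 1.0)))

def label_direction(gaze_dir, up=np.array([0.0, 1.0, 0.0])):
    """Rotate the gaze direction by OFFSET_DEG around the 'right' axis,
    placing the label slightly below the line of sight."""
    right = normalize(np.cross(gaze_dir, up))
    theta = np.radians(OFFSET_DEG)
    # Rodrigues' rotation of gaze_dir around 'right'
    return (gaze_dir * np.cos(theta)
            + np.cross(right, gaze_dir) * np.sin(theta)
            + right * np.dot(right, gaze_dir) * (1 - np.cos(theta)))

def update_label(gaze_dir, label_dir, snapped_until, t):
    """One frame of the placement logic; returns the new label direction
    and the time until which the label remains snapped to the gaze."""
    if t < snapped_until:
        return normalize(gaze_dir), snapped_until        # keep label on the axis of sight
    if angle_deg(gaze_dir, label_dir) < SNAP_DEG:
        return normalize(gaze_dir), t + SNAP_DURATION    # user looked at it: snap briefly
    return label_direction(normalize(gaze_dir)), snapped_until  # otherwise keep the offset
```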

Based on this concept, we investigate how users understand frames of reference, specifically whether users relate directions and alignments to head or world coordinates. We further investigate whether information has a preferred motion behavior. Prototypical implementations of three variants are presented to users in guided interviews. The three variants comprise a rigid offset and two different floating motion behaviors of the information. The floating algorithms implement an inertia-based model and either allow the user's gaze to surpass the information or let the gaze push the information. Testing our prototypes yielded the findings that users strongly prefer information that maintains a world relation, and that less extra motion is preferred.
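
As a rough illustration of the two floating behaviors, the sketch below reduces the problem to one angular dimension: in a "surpass" variant a spring-damper lets the label lag behind and be overtaken by the gaze before catching up, while in a "push" variant the label only moves once the gaze closes in on it. The gains, the offset, and the mode names are assumptions for illustration only, not the parameters or algorithms used in the paper.

```python
def step_floating(label_angle, label_vel, gaze_angle, dt,
                  mode="surpass", offset=15.0, stiffness=4.0, damping=3.0):
    """Advance the label by one time step dt (angles in degrees)."""
    target = gaze_angle + offset                     # rest position: offset from the gaze
    if mode == "surpass":
        # Spring-damper pulls the label toward the target; the gaze may
        # temporarily move past the lagging label while it catches up.
        accel = stiffness * (target - label_angle) - damping * label_vel
        label_vel += accel * dt
        label_angle += label_vel * dt
    elif mode == "push":
        # The label stays put until the gaze comes within the offset,
        # at which point it is pushed ahead of the gaze.
        if gaze_angle > label_angle - offset:
            label_angle = gaze_angle + offset
            label_vel = 0.0
        else:
            label_vel *= max(0.0, 1.0 - damping * dt)  # coast and slow down
            label_angle += label_vel * dt
    return label_angle, label_vel
```

Calling this once per frame with the current gaze angle yields either a label that trails the gaze with visible inertia ("surpass") or one that is shoved along by it ("push").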
