A natural interface for multi-focal plane head mounted displays using 3D gaze
PubDate: May 2014
Teams: German Research Center for Artificial Intelligence; Osaka University
Writers: Takumi Toyama; Jason Orlosky; Daniel Sonntag; Kiyoshi Kiyokawa
PDF: A natural interface for multi-focal plane head mounted displays using 3D gaze
Abstract
In mobile augmented reality (AR), it is important to develop interfaces for wearable displays that not only reduce distraction, but that can be used quickly and in a natural manner. In this paper, we propose a focal-plane-based interaction approach with several advantages over traditional methods designed for head mounted displays (HMDs) with only one focal plane. Using a novel prototype that combines a monoscopic multi-focal plane HMD and eye tracker, we facilitate interaction with virtual elements such as text or buttons by measuring eye convergence on objects at different depths. This can prevent virtual information from being unnecessarily overlaid onto real-world objects that are at a different range, but in the same line of sight. We then use our prototype in a series of experiments testing the feasibility of interaction. Despite only being presented with monocular depth cues, users correctly selected virtual icons in near, mid, and far planes in 98.6% of cases.
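The core idea — inferring fixation depth from eye convergence and mapping it to a focal plane — can be sketched with simple vergence geometry. The following is an illustrative sketch, not the authors' implementation: it assumes a symmetric fixation (both eyes rotated inward by equal horizontal angles), a known interpupillary distance, and hypothetical near/mid/far depth cutoffs.

```python
import math

def fixation_depth(ipd_m, left_yaw_rad, right_yaw_rad):
    """Estimate fixation depth (meters) from the horizontal gaze angles
    of both eyes. Positive yaw means the eye is rotated toward the nose,
    so the sum of the two yaws is the total vergence angle.

    For a symmetric fixation, depth d satisfies
    tan(vergence / 2) = (ipd / 2) / d.
    """
    vergence = left_yaw_rad + right_yaw_rad
    if vergence <= 0:
        # Parallel or diverging gaze: fixation is effectively at infinity.
        return float("inf")
    return ipd_m / (2.0 * math.tan(vergence / 2.0))

def select_plane(depth_m, boundaries=(0.5, 2.0)):
    """Map an estimated depth to one of three focal planes.
    The boundary depths here are hypothetical, not from the paper."""
    near_cutoff, mid_cutoff = boundaries
    if depth_m < near_cutoff:
        return "near"
    if depth_m < mid_cutoff:
        return "mid"
    return "far"

# Example: eyes converging on a point 0.5 m away with a 63 mm IPD.
yaw = math.atan2(0.063 / 2.0, 0.5)
depth = fixation_depth(0.063, yaw, yaw)
plane = select_plane(depth)
```

In practice, a gaze-contingent interface like the one described would also need to smooth noisy per-frame vergence estimates (e.g. over a short fixation window) before switching which focal plane's content is active.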