Modality and Depth in Touchless Smartphone Augmented Reality Interactions
PubDate: June 2020
Teams: Brown University;FX Palo Alto Laboratory Inc;University of Pittsburgh
Writers: Jing Qian;David A. Shamma;Daniel Avrahami;Jacob Biehl
PDF: Modality and Depth in Touchless Smartphone Augmented Reality Interactions
Abstract
Augmented reality (AR) on smartphone devices allows people to interact with virtual objects anchored in the real world through the device’s viewport. Typically, smartphone AR interactions rely on the device’s 2D touchscreen, which is disconnected from the modality and depth of the virtual objects. In this paper, we studied 15 participants’ preferences, performance, and cognitive load on a set of common smartphone tasks performed at two interaction depths (close-range and distant) with two touchless modalities (hand tracking and screen dwell). We found that distant AR interactions were significantly faster, required less cognitive effort, and were strongly preferred by participants. Within each interaction depth, the two modalities performed comparably. When designing touchless modalities on smartphones, we suggest using distant interactions when overall performance is the top priority; otherwise, hand tracking and screen dwell can serve as equally effective back-ups for each other.