Eyes-free Target Acquisition During Walking in Immersive Mixed Reality

PubDate: September 2020

Teams: The University of Melbourne

Authors: Qiushi Zhou; Difeng Yu; Martin N Reinoso; Joshua Newn; Jorge Goncalves; Eduardo Velloso

PDF: Eyes-free Target Acquisition During Walking in Immersive Mixed Reality

Abstract

Reaching towards out-of-sight objects while walking is a common task in daily life; however, the same task can be challenging when wearing an immersive Head-Mounted Display (HMD). In this paper, we investigate the effects of spatial reference frame, walking path curvature, and target placement relative to the body on users' performance in manually acquiring out-of-sight targets located around their bodies as they walk in a spatial-mapping Mixed Reality (MR) environment wearing an immersive HMD. We found that walking and increased path curvature negatively affected overall spatial accuracy, and that performance benefited more from using the torso as the reference frame than the head. We also found that targets placed at maximum reaching distance yielded less error in the angular rotation and depth of the reaching arm. We discuss our findings with regard to human walking kinesthetics and sensory integration in the peripersonal space during locomotion in immersive MR. We provide design guidelines for future immersive MR experiences featuring spatial mapping and full-body motion tracking to support a better embodied experience.
