An Evaluation of Screen Parallax, Haptic Feedback, and Sensory-Motor Mismatch on Near-Field Perception-Action Coordination in VR
PubDate: October 2021
Teams: Clemson University
Writers: David Brickler; Sabarish V. Babu
Abstract
Virtual reality (VR) displays suffer from factors such as the vergence-accommodation conflict that negatively impact depth perception and cause users to misjudge distances to the objects they select. In addition, popular large-screen immersive displays convey the depth of any rendered target through the screen parallax of its points, which fall within stereoscopic voxels: distinct units of space that dictate how far an object appears in front of or behind the screen. Because these voxels are bounded by rays emanating from the viewer's eyes (the left and right centers of projection), voxel density is higher in front of the screen (in regions of negative screen parallax) than behind it (in regions of positive screen parallax), implying a higher spatial resolution of depth in front of the screen than behind it. Our experiment implements a near-field fine-motor pick-and-place task in which users pick up a ring and place it around a targeted peg. The targets are arranged in linear configurations of 3, 5, and 7 pegs along the front-to-back axis, with the center peg placed at the same depth as the screen. We use this task to evaluate how users manipulate objects in positive versus negative screen parallax space, measured by efficiency, accuracy, and economy of movement. In addition, we evaluate how users' performance is moderated by haptic feedback and by mismatch between visual and proprioceptive information. Our results reveal that users perform more efficiently in negative screen parallax space and that haptic feedback and visuo-proprioceptive mismatch affect placement efficiency. The implications of these findings are described in the later sections of the article.
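The density claim follows from standard stereo projection geometry rather than anything specific to the article. The minimal sketch below (the parameter values for eye separation, screen distance, and pixel pitch are illustrative assumptions, not taken from the paper) uses the textbook screen-parallax relation p = e(z - d)/z to compute how much depth a single pixel of parallax spans at various parallax values.

```python
# Sketch of the stereo geometry behind the voxel-density claim.
# Assumed (hypothetical) parameters: eye separation e, viewer-to-screen
# distance d, and pixel pitch dp (one renderable parallax step on screen).
# Standard relation: p = e * (z - d) / z for a point at viewing distance z,
# so inverting gives z = e * d / (e - p).

e = 0.065      # interpupillary distance (m), assumed
d = 1.0        # distance from viewer to screen plane (m), assumed
dp = 0.0005    # smallest renderable parallax step, e.g. one pixel (m), assumed

def depth_from_parallax(p):
    """Invert p = e*(z - d)/z: viewing distance z of a point with parallax p."""
    return e * d / (e - p)

def depth_step(p):
    """Depth interval covered by one pixel of parallax change at parallax p."""
    return depth_from_parallax(p + dp) - depth_from_parallax(p)

# Negative parallax (object in front of the screen) vs. positive (behind):
for p in (-0.02, -0.01, 0.0, 0.01, 0.02):
    print(f"p = {p:+.3f} m -> z = {depth_from_parallax(p):.3f} m, "
          f"one-pixel depth step = {1000 * depth_step(p):.2f} mm")
```

With these assumed values, one pixel of parallax spans roughly 4.5 mm of depth at p = -0.02 m (in front of the screen) but about 16 mm at p = +0.02 m (behind it); the step size grows monotonically with parallax, consistent with the abstract's point that depth is more finely quantized in negative than in positive screen parallax space.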