Optical distortions in VR bias the perceived slant of moving surfaces

Note: We do not have the ability to review this paper.

PubDate: December 2020

Teams: York University

Writers: Jonathan Tong; Robert S. Allison; Laurie M. Wilcox

PDF: Optical distortions in VR bias the perceived slant of moving surfaces


The magnifying optics of virtual reality (VR) head-mounted displays (HMDs) often cause undesirable pincushion distortion in the displayed imagery. Eccentrically increasing magnification radially displaces image points away from the optical axis, causing straight lines to curve outwards. This, in turn, should affect the 3D perception of surface shape by warping binocular and monocular depth cues. Previous research has shown that distortion-induced biases in perceived slant do occur in static images. However, most use cases in VR involve moving images. Here we evaluate the impact of motion on biases in perceived slant. An HMD was used to present flat, textured surfaces that varied in slant and were either stationary or translated laterally by the observer. In separate studies we varied the degree of distortion and evaluated the impact on perceived slant at several locations along the surface. We found that, irrespective of whether the surface was moving or stationary, distortion introduced significant bias into local slant estimates. The pattern of results is consistent with the surface appearing concave (as if viewing the inside surface of a bowl), as predicted from the warping of binocular and monocular cues. Importantly, the intermediate distortion level produced the same, but weaker, pattern of biases seen in the fully distorted condition. When an appropriate level of pre-warping was applied, slant perception was veridical. Overall, our results highlight the importance of sufficiently correcting for optical distortions in VR HMDs to enable veridical perception of surface attitude.
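The paper does not specify its distortion model or pre-warp parameters; as an illustration only, the radial displacement and pre-warp correction described above can be sketched with a standard one-parameter radial distortion model (the function names and the coefficient `k1` are assumptions, not taken from the paper). A positive `k1` pushes points outward with increasing eccentricity (pincushion); pre-warping inverts that mapping so the lens distortion cancels:

```python
import numpy as np

def radial_distort(x, y, k1):
    """Apply one-parameter radial (pincushion for k1 > 0) distortion
    to normalized image coordinates centered on the optical axis."""
    r2 = x**2 + y**2
    scale = 1.0 + k1 * r2          # magnification grows with eccentricity
    return x * scale, y * scale

def radial_prewarp(xd, yd, k1, iters=20):
    """Invert the distortion by fixed-point iteration, so that
    radial_distort(radial_prewarp(p)) ≈ p. Converges for small k1 * r^2."""
    x, y = xd, yd                   # initial guess: the distorted point
    for _ in range(iters):
        r2 = x**2 + y**2
        x = xd / (1.0 + k1 * r2)
        y = yd / (1.0 + k1 * r2)
    return x, y
```

Rendering the pre-warped (barrel-distorted) image and viewing it through the pincushion-distorting optics yields an approximately undistorted percept, which is the correction whose adequacy the study evaluates.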