Reducing simulator sickness with perceptual camera control

Note: We don't have the ability to review this paper

PubDate: November 2019

Teams: Stony Brook University, Adobe Research, Università della Svizzera italiana

Writers: Ping Hu; Qi Sun; Piotr Didyk; Li-Yi Wei; Arie E. Kaufman

PDF: Reducing simulator sickness with perceptual camera control

Abstract

Virtual reality provides an immersive environment but can induce cybersickness due to the discrepancy between visual and vestibular cues. To avoid this problem, the movement of the virtual camera needs to match the motion of the user in the real world. Unfortunately, this is usually difficult due to the mismatch between the size of the virtual environments and the space available to the users in the physical domain. The resulting constraints on the camera movement significantly hamper the adoption of virtual-reality headsets in many scenarios and make the design of the virtual environments very challenging. In this work, we study how the characteristics of the virtual camera movement (e.g., translational acceleration and rotational velocity) and the composition of the virtual environment (e.g., scene depth) contribute to perceived discomfort. Based on the results from our user experiments, we devise a computational model for predicting the magnitude of the discomfort for a given scene and camera trajectory. We further apply our model to a new path planning method which optimizes the input motion trajectory to reduce perceptual sickness. We evaluate the effectiveness of our method in improving perceptual comfort in a series of user studies targeting different applications. The results indicate that our method can reduce the perceived discomfort while maintaining the fidelity of the original navigation, and performs better than simpler alternatives.
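
The abstract outlines a two-stage pipeline: a model that predicts discomfort from camera-motion features (translational acceleration, rotational velocity) and scene depth, followed by a path planner that optimizes the trajectory against that prediction while staying close to the original path. The sketch below is a minimal, hypothetical illustration of such a pipeline in NumPy; the weights `W_ACC`, `W_ROT`, and `W_DEPTH`, the linear form of `discomfort`, and the finite-difference optimizer in `plan_path` are all assumptions for illustration, not the paper's fitted model or method (the authors fit theirs from user-study data).

```python
import numpy as np

# Assumed feature weights; the paper fits its model from user experiments.
W_ACC, W_ROT, W_DEPTH = 1.0, 2.0, 0.5

def discomfort(positions, yaws, depths, dt):
    """Toy discomfort predictor for a camera trajectory.

    positions: (N, 3) camera positions; yaws: (N,) headings in radians;
    depths: (N,) mean scene depth per frame; dt: seconds per frame.
    """
    vel = np.gradient(positions, dt, axis=0)                       # translational velocity
    acc = np.linalg.norm(np.gradient(vel, dt, axis=0), axis=1)     # translational acceleration
    rot = np.abs(np.gradient(np.unwrap(yaws), dt))                 # rotational velocity
    near = 1.0 / np.maximum(depths, 1e-3)   # closer geometry -> stronger optic flow (assumed)
    return float(np.sum(W_ACC * acc + W_ROT * rot * (1.0 + W_DEPTH * near)) * dt)

def plan_path(positions, yaws, depths, dt, fidelity=5.0, iters=200, lr=1e-3, eps=1e-4):
    """Reduce predicted discomfort while penalizing deviation from the
    input path, via finite-difference gradient descent (small N only)."""
    p0, p = positions, positions.copy()

    def objective(q):
        return discomfort(q, yaws, depths, dt) + fidelity * np.mean((q - p0) ** 2)

    for _ in range(iters):
        base = objective(p)
        grad = np.zeros_like(p)
        for idx in np.ndindex(*p.shape):       # numerical gradient, one coordinate at a time
            q = p.copy()
            q[idx] += eps
            grad[idx] = (objective(q) - base) / eps
        p -= lr * grad
    return p

if __name__ == "__main__":
    t = np.linspace(0.0, 5.0, 50)
    pos = np.stack([t, np.sin(3.0 * t), np.zeros_like(t)], axis=1)  # jerky input path
    yaw = 0.3 * np.sin(2.0 * t)
    depth = np.full_like(t, 2.0)
    dt = t[1] - t[0]
    smoothed = plan_path(pos, yaw, depth, dt)
    print("predicted discomfort before:", discomfort(pos, yaw, depth, dt))
    print("predicted discomfort after: ", discomfort(smoothed, yaw, depth, dt))
```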
