Virtual Reality Sickness Predictor: Analysis of visual-vestibular conflict and VR contents

Note: We are unable to review this paper.

PubDate: September 2018

Teams: Yonsei University

Writers: Jaekyung Kim; Woojae Kim; Sewoong Ahn; Jinwoo Kim; Sanghoon Lee

PDF: Virtual Reality Sickness Predictor: Analysis of visual-vestibular conflict and VR contents

Abstract

Predicting the degree of sickness is an imperative goal for guaranteeing viewing safety when watching virtual reality (VR) contents. Ideally, such predictive models should be explained in terms of the human visual system (HVS). When viewing VR contents through a head-mounted display (HMD), there is a conflict between the user's actual motion and the visually perceived motion. This results in an unnatural visual-vestibular sensory mismatch that causes side effects such as nausea, oculomotor disturbance, disorientation, and asthenopia (eyestrain). In this paper, we propose a framework called the VR sickness predictor (VRSP), which uses an interaction model between the user's motion and the vestibular system. VRSP extracts two types of features: a) perceptual motion features obtained through a visual-vestibular interaction model, and b) statistical content features that affect the user's motion perception. Furthermore, we build a VR sickness database of 36 virtual scenes to evaluate the performance of VRSP. Through rigorous experiments, we demonstrate that the correlation between the proposed model's predictions and subjective sickness scores reaches approximately 72%.
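The ~72% figure refers to the correlation between VRSP's predicted scores and the subjective sickness scores collected for the database. As an illustration of that evaluation step (not the authors' code; the score values below are hypothetical), such a correlation can be computed as a Pearson coefficient:

```python
import math

def pearson(xs, ys):
    """Pearson linear correlation coefficient between two score lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical predicted vs. subjective sickness scores for a few scenes.
predicted = [2.1, 3.4, 1.8, 4.0, 2.9]
subjective = [2.0, 3.6, 1.5, 3.8, 3.1]

r = pearson(predicted, subjective)
print(round(r, 3))
```

A value of r near 1 indicates that the predictor ranks and scales scene sickness much like the human raters; the paper reports roughly 0.72 on its 36-scene database.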
