Exploring Visuo-haptic Feedback Congruency in Virtual Reality
PubDate: October 2020
Teams: University of Lincoln
Writers: Benjamin Williams; Alexandra E. Garton; Christopher J. Headleand
Visuo-haptic feedback is an important aspect of virtual reality experiences, and several previous works have investigated its benefits and effects. A key aspect of this domain is the congruency of crossmodal feedback and how it affects users. However, one sub-domain that has received surprisingly little attention is visuo-haptic congruency in interactive multisensory settings. This gap is especially important given that multisensory integration is crucial to player immersion in virtual reality video games. In this paper, we attempt to address this lack of research. To this end, 50 participants played a virtual reality racing game with either congruent or incongruent visuo-haptic feedback. Specifically, participants used a driving simulator with physical gear-shift interfaces: one treatment group used a stick-shift gearbox, and the other a paddle-shift setup. The virtual car they drove (a Formula Rookie race car) was visually congruent only with the stick-shift setup. A motion simulator was also used to provide synchronous vestibular cues and broaden the range of modalities involved in multisensory integration. The racing simulator used was Project CARS 2, one of the world's most popular commercial racing simulators. Our findings showed no significant differences between the groups in measures of user presence or in-game performance, counter to previous work on visuo-haptic congruency. However, the Self-Evaluation of Performance subscale of the Presence Questionnaire (PQ) approached significance. Our results can help inform games and simulation developers, especially those targeting virtual reality.