Detection of Scaled Hand Interactions in Virtual Reality: The Effects of Motion Direction and Task Complexity
PubDate: May 2020
Teams: University of Florida
Writers: Shaghayegh Esmaeili; Brett Benda; Eric D. Ragan
In virtual reality (VR), natural physical hand interaction allows users to manipulate virtual content with their real hand movements. While the most straightforward use of tracked hand motion maintains a one-to-one mapping between the physical and virtual worlds, some applications can benefit from scaled or redirected interactions that alter the mapping between a user’s physical movements and the magnitude of the corresponding virtual movements. However, large deviations in interaction fidelity may distract users or reduce perceived realism. It is therefore important to know the extent to which remapping techniques can be applied to scaled interactions in VR without users detecting the difference. In this paper, we extend prior research on redirected hand techniques by investigating user perception of scaled hand movements and estimating detection thresholds for different types of hand motion in VR. We conducted two experiments with a two-alternative forced-choice (2AFC) design to estimate the detection thresholds of remapped interaction: the first tested the perception of motion scaling for simple hand movements, and the second involved more complex reaching motions in a cognitively demanding game scenario. We present estimated detection thresholds for scale values that can be applied to virtual hand movements without users noticing the difference. Our findings show that detection thresholds differ significantly based on the direction of hand movement (horizontal, vertical, and depth).
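The paper does not include its analysis code, but threshold estimation from 2AFC data of this kind is commonly done by fitting a psychometric function to the proportion of "virtual movement was larger" responses per scaling gain, then reading off the gains at fixed response probabilities (e.g., 25% and 75%). Below is a minimal sketch of that idea; the response data, logistic model, and grid-search fit are all illustrative assumptions, not the authors' method or data.

```python
import math

# Hypothetical 2AFC responses (assumed for illustration): for each scaling
# gain applied to the virtual hand, the proportion of trials in which the
# participant judged the virtual movement to be larger than the physical one.
data = [
    (0.80, 0.05), (0.90, 0.15), (0.95, 0.30), (1.00, 0.50),
    (1.05, 0.70), (1.10, 0.85), (1.20, 0.95),
]

def logistic(g, g0, k):
    """Psychometric function: P('virtual larger') at gain g."""
    return 1.0 / (1.0 + math.exp(-k * (g - g0)))

def fit(data):
    """Crude grid-search fit of (g0, k) minimizing squared error."""
    best = None
    for g0 in (0.90 + 0.001 * i for i in range(201)):    # PSE candidates
        for k in (1.0 + 0.25 * j for j in range(200)):   # slope candidates
            err = sum((logistic(g, g0, k) - p) ** 2 for g, p in data)
            if best is None or err < best[0]:
                best = (err, g0, k)
    return best[1], best[2]

def threshold(p, g0, k):
    """Invert the logistic to find the gain answered 'larger' with prob. p."""
    return g0 - math.log(1.0 / p - 1.0) / k

g0, k = fit(data)
lower = threshold(0.25, g0, k)  # gains below this are reliably detected
upper = threshold(0.75, g0, k)  # gains above this are reliably detected
print(f"PSE ~ {g0:.3f}, undetected scaling interval ~ [{lower:.3f}, {upper:.3f}]")
```

Under this convention, gains inside the [25%, 75%] interval are candidates for unnoticed remapping; the paper's contribution is estimating such intervals separately for horizontal, vertical, and depth motions and under differing task complexity.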