Scaling Inertial Forces to Alter Weight Perception in Virtual Reality
PubDate: September 2018
Teams: Stanford University
Writers: Jacob M. Suchoski; Susana Martinez; Allison M. Okamura
As the field of haptics in virtual reality expands, wearable devices are being explored as alternatives to traditional kinesthetic force feedback devices, which are often limited in workspace. Skin deformation feedback offers a user-grounded feedback modality that mimics cutaneous interactions with the real world, but it can suffer from force-output saturation due to the actuation constraints required to achieve a small form factor. Saturation of haptic devices limits the mechanical properties and interactions that can be rendered in a virtual environment, specifically the weight that can be rendered when a user manipulates a virtual object. We use scaled inertial forces to alter virtual weight perception during a dynamic grasp-lift-and-place task in a virtual environment with haptic feedback via two wearable skin deformation feedback devices. We conducted a study that began with an open-response exercise to assess how participants interpreted scaled inertial forces when interacting with virtual blocks. Participants then performed a series of trials to measure the Point of Subjective Equality (PSE) of virtual weight with scaled inertial forces, using a reference of 200 g under the normal (no scaling) condition. PSEs for inertial scaling factors of 2 and 3 were 171 g and 151 g, respectively. These results demonstrate the effectiveness of a unique haptic rendering algorithm that can convey larger weights without saturating the force output of the haptic device.
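To make the idea concrete, here is a minimal sketch (not the authors' code; function names and signatures are illustrative) of one way to scale only the inertial component of the load force on a grasped virtual object: the gravity term is left unscaled, so the static weight cue is unchanged and the device's force output is not saturated, while the acceleration-dependent term is amplified by a scaling factor.

```python
# Hedged sketch of inertial-force scaling for a grasped virtual object.
# Assumption: the commanded load force is split into a static gravity
# term and a dynamic inertial term, and only the latter is scaled.

G = 9.81  # gravitational acceleration, m/s^2


def rendered_load_force(mass_kg, accel_mps2, k=1.0):
    """Vertical load force (N) commanded to the haptic device.

    mass_kg    : virtual object mass
    accel_mps2 : object's vertical acceleration (upward positive)
    k          : inertial scaling factor (k = 1 is the unscaled case)
    """
    gravity_term = mass_kg * G                 # static weight, unscaled
    inertial_term = k * mass_kg * accel_mps2   # scaled dynamic component
    return gravity_term + inertial_term


# During a static hold (zero acceleration) the force equals the true
# weight, so the scaling is felt only while the object accelerates,
# e.g. during the lift phase of a grasp-lift-and-place motion.
static = rendered_load_force(0.2, 0.0, k=3)   # 0.2 kg (200 g) block
dynamic = rendered_load_force(0.2, 2.0, k=3)  # lifting at 2 m/s^2
```

Because the peak device force is driven by brief acceleration transients rather than a sustained offset, larger perceived weights can be conveyed without raising the steady-state force toward the device's saturation limit.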