Tactile Rendering Based on Skin Stress Optimization
PubDate: July 2020
Teams: Universidad Rey Juan Carlos
Writers: Mickeal Verschoor, Dan Casas, Miguel A. Otaduy
PDF: Tactile Rendering Based on Skin Stress Optimization
Project: Tactile Rendering Based on Skin Stress Optimization
Abstract
We present a method to render virtual touch, such that the stimulus produced by a tactile device on a user's skin matches the stimulus computed in a virtual environment simulation. To achieve this, we solve the inverse mapping from skin stimulus to device configuration using a novel optimization algorithm. Within this algorithm, we use a device-skin simulation model to estimate rendered stimuli, we account for trajectory-dependent effects efficiently by decoupling the computation of the friction state from the optimization of the device configuration, and we accelerate computations using a neural-network approximation of the device-skin model. Altogether, we enable real-time tactile rendering of rich interactions, including not only smooth rolling but also contact with edges and frictional stick-slip motion. We validate our algorithm both qualitatively through user experiments and quantitatively on a BioTac biomimetic finger sensor.
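To illustrate the core idea described in the abstract, the sketch below shows, in schematic form, how a device configuration could be optimized so that a surrogate device-skin model reproduces a target skin-stress field. This is a minimal toy example, not the authors' implementation: `surrogate_skin_stress` is an arbitrary stand-in for the paper's neural-network approximation, the dimensions are illustrative, and the finite-difference gradient descent is only one possible way to drive the inverse mapping.

```python
# Minimal sketch (assumed, not the paper's code): fit a device configuration
# so that a toy device-skin surrogate matches a target skin-stress vector.
import numpy as np

def surrogate_skin_stress(device_config):
    """Toy stand-in for a learned device-skin model: maps device
    configuration parameters to a skin-stress vector."""
    W = np.array([[1.0, 0.2, 0.0],
                  [0.1, 0.8, 0.3],
                  [0.0, 0.4, 1.2],
                  [0.5, 0.0, 0.6]])
    return np.tanh(W @ device_config)

def render_loss(device_config, target_stress):
    """Squared error between rendered and target skin stress."""
    return np.sum((surrogate_skin_stress(device_config) - target_stress) ** 2)

def optimize_device(target_stress, x0, iters=200, lr=0.1, eps=1e-5):
    """Finite-difference gradient descent over the device configuration."""
    x = x0.copy()
    for _ in range(iters):
        grad = np.zeros_like(x)
        for i in range(x.size):
            dx = np.zeros_like(x)
            dx[i] = eps
            grad[i] = (render_loss(x + dx, target_stress)
                       - render_loss(x - dx, target_stress)) / (2 * eps)
        x -= lr * grad
    return x

# Example target stress (made-up values standing in for the stimulus
# computed by the virtual-environment simulation).
target = np.array([0.3, -0.1, 0.5, 0.2])
config = optimize_device(target, x0=np.zeros(3))
print("device configuration:", config, "loss:", render_loss(config, target))
```

In the paper, the friction state is computed separately from this optimization and the device-skin model is approximated with a neural network for speed; the sketch omits both and only conveys the inverse-mapping structure.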