Hasti: Haptic and Audio Synthesis for Texture Interactions
PubDate: July 6, 2021
Teams: Facebook Reality Labs
Writers: Sonny Chan, Chase Tymms, Nicholas Colonnese
PDF: Hasti: Haptic and Audio Synthesis for Texture Interactions
Abstract
Multi-sensory stimuli can greatly enhance immersion in interactive virtual environments. Advances in graphics algorithms and technologies like VR displays have pushed the appearance of interactive virtual worlds to unprecedented fidelity, but rendering sound and, especially, touch feedback of comparable quality remains a challenge. We describe a method for real-time synthesis of vibrotactile haptic and audio stimuli for interactions with textured surfaces in 3-D virtual environments. Standard descriptions of object geometry and material properties, including displacement and roughness texture maps typically used for physically-based visual rendering, are employed to generate realistic sound and touch feedback consistent with appearance. Our method reconstructs meso- and microscopic surface features on the fly along a contact trajectory, and runs a micro-contact dynamics simulation whose outputs drive vibrotactile haptic actuators and modal sound synthesis. An exploratory, absolute identification user study was conducted as an initial evaluation of our synthesis methods.
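The abstract mentions modal sound synthesis driven by the output of a micro-contact simulation. The following is a minimal, illustrative sketch of that general technique, not the authors' implementation: a contact-force excitation signal is filtered through a bank of damped resonators, one per mode. The function name, mode frequencies, dampings, and gains are placeholder assumptions, not values from the paper.

```python
import numpy as np

def modal_synthesis(excitation, sample_rate, freqs_hz, dampings, gains):
    """Render audio by filtering an excitation signal through resonant modes.

    Each mode is a two-pole resonator
        y[n] = 2 r cos(w) y[n-1] - r^2 y[n-2] + g x[n],
    a standard discretization of a damped harmonic oscillator.
    """
    out = np.zeros_like(excitation)
    for f, d, g in zip(freqs_hz, dampings, gains):
        w = 2.0 * np.pi * f / sample_rate
        r = np.exp(-d / sample_rate)            # per-sample decay factor
        a1, a2 = 2.0 * r * np.cos(w), -r * r    # resonator feedback coefficients
        y1 = y2 = 0.0
        mode = np.empty_like(excitation)
        for n, x in enumerate(excitation):
            y = g * x + a1 * y1 + a2 * y2
            mode[n] = y
            y1, y2 = y, y1
        out += mode
    return out

if __name__ == "__main__":
    # Example: an impulse-like excitation, as might come from a single contact event.
    sr = 44100
    excitation = np.zeros(sr // 10)
    excitation[0] = 1.0                          # unit impulse at contact onset
    audio = modal_synthesis(
        excitation, sr,
        freqs_hz=[440.0, 1230.0, 2610.0],        # placeholder mode frequencies
        dampings=[8.0, 15.0, 30.0],              # placeholder decay rates (1/s)
        gains=[1.0, 0.6, 0.3],                   # placeholder mode gains
    )
```

In a real-time pipeline such as the one described, the excitation would instead be the continuous force signal produced by the micro-contact dynamics simulation as the contact point traverses the reconstructed surface microgeometry.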