Mouth Haptics in VR using a Headset Ultrasound Phased Array
PubDate: April 2022
Teams: Carnegie Mellon University Future Interfaces Group
Writers: Vivian Shen, Craig Shultz, and Chris Harrison
PDF: Mouth Haptics in VR using a Headset Ultrasound Phased Array
Project: Mouth Haptics in VR using a Headset Ultrasound Phased Array
Abstract
Virtual and augmented reality (VR/AR) headsets continue to make impressive strides in immersion and realism, particularly in visual and audio content. However, the delivery of rich tactile sensations continues to be a significant and open challenge. Critically, consumers want robust and integrated solutions – ones that do not require any extra devices or limit freedom of movement. For this reason, vibration motors in handheld controllers are the current consumer state of the art. While more sophisticated approaches exist (e.g., exoskeletons, haptic vests, body-cantilevered accessories, in-room air cannons), they have yet to see even modest consumer adoption.
Simultaneously, the mouth has been largely overlooked as a haptic target in VR/AR, despite being second only to the fingertips in sensitivity and mechanoreceptor density. Equally important, the proximity of the mouth to the headset offers a significant opportunity to enable on- and in-mouth haptic effects without needing to run wires or wear an extra accessory. However, consumers do not want to cover their entire face, let alone put something up against (or into) their mouth. For AR, the industry is trending towards glasses-like form factors so as to preserve as much facial expression as possible for human-human communication. Even in VR, smaller headsets are the consumer trend, with the mouth exposed and unencumbered.
In this research, we built a thin, compact, beamforming array of ultrasonic transducers that could be integrated into future headsets in a practical, consumer-friendly way. We use this hardware to focus airborne acoustic energy onto the lips and into the mouth, creating sensations such as taps and continuous vibrations, which we can also animate along arbitrary 3D paths. In addition to the lips, our effects can be felt on the teeth and tongue. When coupled with coordinated graphical feedback, the effects are convincing, boosting realism and immersion. We built a variety of sensory demos, including raindrops, mud splatter, pushing through cobwebs, and crawling bugs. While in-air haptics using ultrasonic phased arrays is not new, we are the first to integrate the technology into a headset for use on the mouth and to explore the rich application space.
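To make the beamforming idea concrete: a phased array forms a focal point by delaying each transducer's drive signal so that every wavefront arrives at the target in phase, and a tactile sensation is typically rendered by amplitude-modulating that ultrasonic carrier at a low, skin-perceptible rate. The Python sketch below illustrates the phase computation only; the 8x8 grid geometry, 10.5 mm pitch, 40 kHz carrier, and function names are illustrative assumptions, not the authors' actual hardware or code.

```python
import numpy as np

SPEED_OF_SOUND = 343.0    # m/s in air at ~20 C
CARRIER_FREQ = 40_000.0   # Hz; 40 kHz elements are common in mid-air haptics (assumption)

def focusing_phases(element_positions, focal_point):
    """Per-element phase offsets (radians) so all wavefronts arrive
    at focal_point in phase, creating a pressure focus there.

    element_positions: (N, 3) array of transducer centers, meters.
    focal_point: (3,) target location, meters.
    """
    distances = np.linalg.norm(element_positions - focal_point, axis=1)
    wavenumber = 2.0 * np.pi * CARRIER_FREQ / SPEED_OF_SOUND
    # Advance each element by its propagation phase, wrapped to [0, 2*pi).
    return (-wavenumber * distances) % (2.0 * np.pi)

# Hypothetical 8x8 array with 10.5 mm pitch, focusing 5 cm in front of its center.
pitch = 0.0105
xs, ys = np.meshgrid(np.arange(8) * pitch, np.arange(8) * pitch)
elements = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)])
phases = focusing_phases(elements, focal_point=np.array([0.037, 0.037, 0.05]))

# To render a tap or continuous vibration, the carrier at the focus would be
# amplitude-modulated at a tactile rate (e.g., ~100-200 Hz); animating an effect
# along a 3D path means recomputing these phases as the focal point moves.
```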