Demonstrating Electrical Head Actuation: Enabling Interactive Systems to Directly Manipulate Head Orientation
Published: April 2022
Team: University of Chicago
Authors: Yudai Tanaka; Jun Nishida; Pedro Lopes
Abstract
We demonstrate a novel interface concept in which interactive systems directly manipulate the user’s head orientation. We implement this using electrical muscle stimulation (EMS) of the neck muscles, which turns the head around its yaw (left/right) and pitch (up/down) axes. As the first exploration of EMS for head actuation, we characterized which neck muscles can be actuated robustly. We then demonstrated how this enables previously impossible interactions by building a range of applications, such as: (1) directly changing the user’s head orientation to help them locate objects in AR; (2) a sound controller that uses neck movements as both input and output; (3) synchronizing the head orientations of two users, which lets one user communicate head nods to another while listening to music; and (4) rendering force feedback from VR punches by actuating the user’s neck.
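To make the actuation idea concrete, here is a minimal sketch of how a system might map a desired yaw/pitch change to per-channel EMS intensities. The channel names, the linear angle-to-intensity mapping, and the muscle assignments are illustrative assumptions for this sketch, not the authors' implementation; in particular, the rule that only one muscle of an antagonist pair is driven at a time is a simplifying assumption.

```python
def ems_command(yaw_deg, pitch_deg, max_deg=30.0, max_intensity=1.0):
    """Map a desired head-orientation change (degrees) to stimulation
    intensities in [0, max_intensity] for four hypothetical EMS channels.

    Sign convention (assumed): positive yaw = turn right,
    positive pitch = look up. Only one muscle of each antagonist
    pair is ever driven at a time.
    """
    def scale(angle):
        # Linear ramp, saturating at max_deg; an assumption, not measured.
        return min(abs(angle) / max_deg, 1.0) * max_intensity

    cmd = {"left_scm": 0.0, "right_scm": 0.0,
           "neck_flexors": 0.0, "neck_extensors": 0.0}
    if yaw_deg > 0:       # turn right: contralateral (left) sternocleidomastoid
        cmd["left_scm"] = scale(yaw_deg)
    elif yaw_deg < 0:     # turn left
        cmd["right_scm"] = scale(yaw_deg)
    if pitch_deg > 0:     # look up: neck extensors
        cmd["neck_extensors"] = scale(pitch_deg)
    elif pitch_deg < 0:   # look down: neck flexors
        cmd["neck_flexors"] = scale(pitch_deg)
    return cmd
```

For example, `ems_command(15, -10)` requests a 15° right turn and a 10° downward tilt, producing a half-intensity pulse on the left sternocleidomastoid channel and a weaker pulse on the flexor channel.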