Data-driven sparse skin stimulation can convey social touch information to humans

Note: We do not have the ability to review this paper

PubDate: January 1, 2022

Teams: Stanford University; University of Southern California; Facebook

Writers: Mike Salvato, Sophia R. Williams, Cara M. Nunez, Xin Zhu, Ali Israr, Frances Lau, Keith Klumb, Freddy Abnousi, Allison M. Okamura, Heather Culbertson

PDF: Data-driven sparse skin stimulation can convey social touch information to humans

Abstract

During social interactions, people use auditory, visual, and haptic cues to convey their thoughts, emotions, and intentions. Due to weight, energy, and other hardware constraints, it is difficult to create devices that fully capture the complexity of human touch. Here we explore whether a sparse representation of human touch is sufficient to convey social touch signals. To test this, we collected a dataset of social touch interactions using a soft wearable pressure sensor array, developed an algorithm to map recorded data to an array of actuators, and then applied our algorithm to create signals that drive an array of normal indentation actuators placed on the arm. Using this wearable, low-resolution, low-force device, we find that users are able to distinguish the intended social meaning, and we compare performance to results based on direct human touch. As online communication becomes more prevalent, such systems to convey haptic signals could allow for improved distant socializing and empathetic remote human-human interaction.
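The abstract describes mapping data from a dense wearable pressure sensor array down to a sparse set of indentation actuators. The paper's actual mapping is data-driven and is not specified here; the sketch below is only a minimal illustrative stand-in, assuming a simple spatial pooling where each actuator is driven by the mean pressure over its band of sensor columns. All array sizes and the function name are hypothetical.

```python
import numpy as np

def pool_pressure_to_actuators(frame, n_actuators):
    """Map a dense pressure frame (rows x cols) to sparse actuator drive
    levels by averaging column bands. This is a naive spatial-pooling
    stand-in for illustration, not the paper's data-driven algorithm."""
    rows, cols = frame.shape
    # Split sensor columns into one contiguous band per actuator
    bands = np.array_split(np.arange(cols), n_actuators)
    # Each actuator's drive level is the mean pressure over its band
    return np.array([frame[:, band].mean() for band in bands])

# Hypothetical 4x8 pressure frame from a wearable sensor array
frame = np.arange(32, dtype=float).reshape(4, 8)
drive = pool_pressure_to_actuators(frame, 4)
print(drive)  # one drive level per actuator
```

In practice, a learned mapping could replace the averaging step, and the drive levels would be scaled to each actuator's force range; this sketch only shows the dimensionality reduction the abstract alludes to.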
