Emotion visualization in Virtual Reality: An integrative review

Note: We do not have the ability to review this paper.

PubDate: Dec 2020

Teams: Technische Universität Berlin; University of Technology Sydney (UTS); German Research Center for Artificial Intelligence (DFKI)

Writers: Andres Pinilla, Jaime Garcia, William Raffe, Jan-Niklas Voigt-Antons, Robert Spang, Sebastian Möller

PDF: Emotion visualization in Virtual Reality: An integrative review


A cluster of research in Human-Computer Interaction (HCI) suggests that it is possible to infer some characteristics of users' mental states by analyzing electrophysiological responses in real time. However, it is not clear how to use the information extracted from electrophysiological signals to create visual representations of the emotional states of Virtual Reality (VR) users. Visualizing users' emotions in VR can enable biofeedback therapies for training emotion self-regulation. Understanding how to visualize emotions in VR requires an interdisciplinary approach that integrates disciplines such as psychology, electrophysiology, and audiovisual design. Therefore, this review integrates previous studies from these fields to understand how to develop virtual environments that can automatically create visual representations of users' emotional states. The manuscript addresses this challenge in three sections: first, theories related to emotion and affect are compared; second, evidence suggesting that specific visual and sound cues tend to be associated with particular emotions is discussed; and third, some of the available methods for assessing emotions are described.