
Modeling the affective space of 360 virtual reality videos based on arousal and valence for wearable EEG-based VR emotion classification

Note: We don't have the ability to review papers

PubDate: May 2018

Teams: Universiti Malaysia Sabah;

Writers: Nazmi Sofian Suhaimi; Chrystalle Tan Bih Yuan; Jason Teo; James Mountstephens

PDF: Modeling the affective space of 360 virtual reality videos based on arousal and valence for wearable EEG-based VR emotion classification

Abstract

This study attempts to produce a novel database for emotional analysis that uses virtual reality (VR) content, obtained from third-party sources such as YouTube, Discovery VR, Jaunt VR, NYT VR, Veer VR and Google Cardboard, as the visual stimuli for emotion classification using commercial off-the-shelf (COTS) wearable electroencephalography (EEG) headsets. While existing resources for emotional analysis are available, such as the Dataset for Emotion Analysis using EEG, Physiological and Video Signals (DEAP) presented by Koelstra et al. and the Database for Emotional Analysis in Music (DEAM) by Soleymani et al., their contents are focused on music and music video stimuli. The database presented here consists of novel affective tags for virtual reality content, specifically YouTube 360 videos, as evaluated by 15 participants based on the arousal-valence emotion model (AVS). The feedback obtained from these evaluations will serve as the underlying dataset for the next stage of machine learning implementation: the targeted emotion classification of virtual reality stimuli using wearable EEG headsets.
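As an aside, a common way to turn arousal-valence ratings like these into discrete class labels for a classifier is quadrant mapping on the 2D arousal-valence plane. The sketch below is purely illustrative and not taken from the paper; the 1-9 rating scale, the midpoint threshold, and the quadrant names are assumptions.

```python
# Illustrative sketch only: quadrant mapping of arousal-valence ratings to
# discrete emotion labels. The rating scale, midpoint, and label names are
# assumptions for illustration, not details from the paper.

def av_to_quadrant(arousal: float, valence: float, midpoint: float = 5.0) -> str:
    """Map an (arousal, valence) rating pair to one of four emotion quadrants."""
    if arousal >= midpoint and valence >= midpoint:
        return "high-arousal positive"   # e.g. excitement
    if arousal >= midpoint and valence < midpoint:
        return "high-arousal negative"   # e.g. fear
    if arousal < midpoint and valence >= midpoint:
        return "low-arousal positive"    # e.g. calm
    return "low-arousal negative"        # e.g. boredom


# Hypothetical ratings from one participant for three 360 videos.
ratings = {"video_01": (7.2, 8.0), "video_02": (6.5, 2.1), "video_03": (2.4, 6.8)}
labels = {vid: av_to_quadrant(a, v) for vid, (a, v) in ratings.items()}
print(labels)
```

Labels produced this way could then serve as ground truth for training an EEG-based classifier, which is the next stage the authors describe.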
