VREED: Virtual Reality Emotion Recognition Dataset Using Eye Tracking & Physiological Measures

Note: We are unable to review papers.

PubDate: December 2021

Teams: University of Kent; Prince of Songkla University; BBC Research & Development

Writers: Luma Tabbaa; Ryan Searle; Saber Mirzaee Bafti; Md Moinul Hossain; Jittrapol Intarasirisawat; Maxine Glancy; Chee Siang Ang

PDF: VREED: Virtual Reality Emotion Recognition Dataset Using Eye Tracking & Physiological Measures

Abstract

The paper introduces a multimodal affective dataset named VREED (VR Eyes: Emotions Dataset), in which emotions were triggered using immersive 360° Video-Based Virtual Environments (360-VEs) delivered via a Virtual Reality (VR) headset. Behavioural signals (eye tracking) and physiological signals (Electrocardiogram (ECG) and Galvanic Skin Response (GSR)) were captured, together with self-reported responses, from healthy participants (n=34) experiencing 360-VEs (n=12, 1–3 min each) selected through focus groups and a pilot trial. Statistical analysis confirmed the validity of the selected 360-VEs in eliciting the desired emotions. Preliminary machine learning analysis was carried out, demonstrating performance comparable to the state of the art reported in the affective computing literature for non-immersive modalities. VREED is among the first multimodal VR datasets for emotion recognition using behavioural and physiological signals. VREED is made publicly available on Kaggle. We hope that this contribution encourages other researchers to utilise VREED further to understand emotional responses in VR and ultimately to enhance the design of VR experiences in applications where emotion elicitation plays a key role, e.g. healthcare, gaming, and education.
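Since VREED is distributed as a public tabular dataset combining three signal modalities with self-reported labels, a brief sketch of how such a baseline emotion classifier might be assembled may be useful. The example below is a minimal illustration only, not the authors' pipeline: the file names (eye_tracking_features.csv, ecg_features.csv, gsr_features.csv, self_reports.csv) and the participant/video/emotion column names are hypothetical placeholders; the actual layout is documented on the Kaggle page.

```python
# A minimal sketch of a baseline emotion classifier on VREED-style
# tabular features. All file and column names below are hypothetical
# placeholders; consult the Kaggle listing for the actual layout.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical per-trial feature tables, one row per (participant, 360-VE) pair.
eye = pd.read_csv("eye_tracking_features.csv")   # behavioural (gaze) features
ecg = pd.read_csv("ecg_features.csv")            # heart-activity features
gsr = pd.read_csv("gsr_features.csv")            # skin-conductance features
labels = pd.read_csv("self_reports.csv")         # self-reported emotion labels

# Align the three modalities and the labels on shared trial identifiers.
data = (
    eye.merge(ecg, on=["participant", "video"])
       .merge(gsr, on=["participant", "video"])
       .merge(labels, on=["participant", "video"])
)

X = data.drop(columns=["participant", "video", "emotion"])
y = data["emotion"]

# Simple stratified held-out split for a quick baseline.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y
)

clf = RandomForestClassifier(n_estimators=200, random_state=42)
clf.fit(X_train, y_train)
print("Accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

Note that a random trial-level split lets data from the same participant appear in both train and test sets; affective computing work commonly uses subject-independent evaluation (e.g. leave-one-subject-out) to better estimate generalisation to unseen users.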
