Human Emotions Analysis and Recognition Using EEG Signals in Response to 360° Videos

Note: We don't have the ability to review papers

PubDate: Feb 2024

Teams: University of Engineering and Technology; George Washington University

Writers: Haseeb ur Rahman Abbasi, Zeeshan Rashid, Muhammad Majid, Syed Muhammad Anwar

PDF: Human Emotions Analysis and Recognition Using EEG Signals in Response to 360° Videos

Abstract

Emotion recognition (ER) technology is integral to developing innovative applications, such as drowsiness detection and health monitoring, that play a pivotal role in contemporary society. This study delves into ER using electroencephalography (EEG) within immersive virtual reality (VR) environments. Our proposed methodology comprises four main stages: data acquisition, pre-processing, feature extraction, and emotion classification. Acknowledging the limitations of existing 2D datasets, we introduce a groundbreaking 3D VR dataset to elevate the precision of emotion elicitation. Leveraging the Interaxon Muse headband for EEG recording and the Oculus Quest 2 for VR stimuli, we meticulously recorded data from 40 participants, prioritizing subjects without reported mental illnesses. Pre-processing entails rigorous cleaning, uniform truncation, and the application of a Savitzky-Golay filter to the EEG data. Feature extraction encompasses a comprehensive analysis of metrics such as power spectral density, correlation, rational and divisional asymmetry, and power spectrum. To ensure the robustness of our model, we employed 10-fold cross-validation, revealing an average validation accuracy of 85.54%, with a noteworthy maximum accuracy of 90.20% in the best fold. Subsequently, the trained model demonstrated a commendable test accuracy of 82.03%, promising favorable outcomes.
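For readers curious about the kind of pipeline the abstract describes, below is a minimal sketch (not the authors' code) of the pre-processing and feature-extraction steps: Savitzky-Golay smoothing, band-power features from the power spectral density, a simple frontal asymmetry ratio, and 10-fold cross-validation. The 256 Hz sampling rate, the frequency bands, the channel indices used for asymmetry, and the SVM classifier are assumptions for illustration only; the paper's actual configuration may differ.

```python
# Illustrative sketch only, assuming a 4-channel Muse recording at 256 Hz.
import numpy as np
from scipy.signal import savgol_filter, welch
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

FS = 256  # assumed Muse headband sampling rate (Hz)

def preprocess(eeg, window=31, polyorder=3):
    """Smooth each EEG channel with a Savitzky-Golay filter."""
    return savgol_filter(eeg, window_length=window, polyorder=polyorder, axis=-1)

def band_power(eeg, band):
    """Average power spectral density within a frequency band (Welch method)."""
    freqs, psd = welch(eeg, fs=FS, nperseg=FS * 2, axis=-1)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[..., mask].mean(axis=-1)

def extract_features(eeg):
    """Band powers per channel plus an alpha asymmetry ratio between
    two frontal channels (indices 0 and 3 here, an assumption)."""
    bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
    powers = np.concatenate([band_power(eeg, b) for b in bands.values()])
    rasm = band_power(eeg[0], bands["alpha"]) / band_power(eeg[3], bands["alpha"])
    return np.append(powers, rasm)

# Hypothetical usage: X_raw has shape (n_trials, 4, n_samples), y holds emotion labels.
# X = np.array([extract_features(preprocess(trial)) for trial in X_raw])
# scores = cross_val_score(SVC(), X, y, cv=10)  # 10-fold cross-validation
# print(scores.mean())
```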
