Emotion-robust EEG Classification for Motor Imagery
PubDate: May 2020
Teams: Technical University of Munich
Writers: Abdul Moeed
PDF: Emotion-robust EEG Classification for Motor Imagery
Abstract
Developments in Brain-Computer Interfaces (BCIs) are empowering people with severe physical afflictions through their use in assistive systems. A common method of achieving this is Motor Imagery (MI), which maps brain signals to specific commands. The electroencephalogram (EEG) is the preferred means of recording brain signals because it is non-invasive. Despite their potential utility, MI-BCI systems are still confined to research labs. A major cause of this is the lack of robustness of such systems. As hypothesized by two teams during Cybathlon 2016, a particular source of vulnerability is a sharp change in the subject’s state of emotional arousal. This work aims to make MI-BCI systems resilient to such emotional perturbations. To do so, subjects are exposed to high- and low-arousal-inducing virtual reality (VR) environments before EEG data are recorded. The advent of COVID-19 compelled us to modify our methodology: instead of training machine learning algorithms to classify emotional arousal, we opt to classify subjects, who serve as proxies for each arousal state. Additionally, MI models are trained for each subject rather than for each arousal state. As training subjects to use an MI-BCI can be an arduous and time-consuming process, reducing this variability and increasing robustness can considerably accelerate the acceptance and adoption of assistive technologies powered by BCIs.
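The per-subject MI modelling described in the abstract can be illustrated with a minimal sketch, not the author’s exact pipeline: a common MI baseline of common spatial patterns (CSP) features followed by an LDA classifier, trained and evaluated separately for each subject. The subject names, trial counts, channel counts, and synthetic data below are assumptions for illustration only.

```python
# Minimal per-subject MI classification sketch (illustrative; not the paper's exact method).
# Assumes epoched EEG arrays of shape (n_trials, n_channels, n_samples) with binary MI labels.
import numpy as np
from mne.decoding import CSP
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline

rng = np.random.default_rng(0)

def train_subject_model(epochs, labels):
    """Train and evaluate one MI model for a single subject (CSP features + LDA)."""
    clf = Pipeline([
        ("csp", CSP(n_components=4, log=True)),   # spatial filters -> log band-power features
        ("lda", LinearDiscriminantAnalysis()),    # linear classifier on CSP features
    ])
    scores = cross_val_score(clf, epochs, labels, cv=5)
    clf.fit(epochs, labels)                       # final model fitted on all trials
    return clf, scores.mean()

# Placeholder data standing in for real per-subject recordings: two subjects acting as
# proxies for the high- and low-arousal conditions (60 trials, 22 channels, 2 s at 250 Hz).
subjects = {
    "subject_high_arousal": (rng.standard_normal((60, 22, 500)), rng.integers(0, 2, 60)),
    "subject_low_arousal": (rng.standard_normal((60, 22, 500)), rng.integers(0, 2, 60)),
}

for name, (epochs, labels) in subjects.items():
    model, acc = train_subject_model(epochs, labels)
    print(f"{name}: cross-validated accuracy = {acc:.2f}")
```

Training one model per subject, as above, sidesteps the need for arousal labels while still letting each classifier adapt to that subject’s signal characteristics.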