Investigating Evoked EEG Responses to Targets Presented in Virtual Reality

Note: We don't have the ability to review this paper.

PubDate: October 2019

Teams: Columbia University; U.S. Army Research Laboratory

Writers: Pawan Lapborisuth; Josef Faller; Jonathan Koss; Nicholas R. Waytowich; Jonathan Touryan; Paul Sajda

PDF: Investigating Evoked EEG Responses to Targets Presented in Virtual Reality


Virtual reality (VR) offers the potential to study brain function in complex, ecologically realistic environments. However, the additional degrees of freedom make analysis more challenging, particularly with respect to evoked neural responses. In this paper, we designed a target detection task in VR in which we varied the visual angle of targets as subjects moved through a three-dimensional maze. We investigated how the latency and shape of the classic P300 evoked response varied as a function of locking the electroencephalogram data to the target image onset, the target-saccade intersection, or the first fixation on the target. We found, as expected, a systematic shift in the timing of the evoked responses as a function of the type of response locking, as well as a difference in the shape of the waveforms. Interestingly, single-trial analysis showed that the peak discriminability of the evoked responses does not differ between image-locked and saccade-locked analysis, though it decreases significantly when fixation-locked. These results suggest that the perception of visual information in VR environments is spread across time and visual space. Our results point to the importance of considering how information may be perceived in naturalistic environments, specifically those with more complexity and higher degrees of freedom than traditional laboratory paradigms.
