Towards EEG-based eye-tracking for interaction design in head-mounted devices

Note: We don't have the ability to review this paper.

PubDate: December 2017

Teams: Deggendorf Institute of Technology; Julius Maximilian University of Würzburg

Writers: Marc Philipp Dietrich; Götz Winterfeldt; Sebastian von Mammen

PDF: Towards EEG-based eye-tracking for interaction design in head-mounted devices

Abstract

Augmented Reality (AR), Mixed Reality (MR) and Virtual Reality (VR) have an increasing impact on our daily lives. They improve workers' performance in industry and medicine, and gaming and entertainment also profit from these innovations. The enabling devices are often head-mounted. Because conventional input devices such as keyboards, touch screens or classical touch pads cannot be used, new intuitive interaction methods must be developed to control applications. This paper introduces electroencephalographic (EEG) eye-tracking as an additional interaction channel. Ocular artefacts (EOG) in the EEG signal are used to detect eye positions. A Brain-Computer Interface (BCI) is used to capture the data; the data is processed and EOG artefacts are extracted to compute the position of the pupil. The first test runs confirm that ocular artefacts in the EEG signal are strongly correlated with the position of the pupils. Extreme positions of the pupils (horizontal left, horizontal right, vertical up and vertical down) can be detected with high accuracy (true-positive rate of up to 96.6%). In future tests, we will successively refine the detection of eye-movement direction and speed and verify its usage under real-time conditions.
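The paper itself does not publish its detection algorithm, but the abstract outlines the general idea: EOG artefacts in frontal EEG channels deflect in characteristic directions when the eyes move to extreme positions, so a simple threshold on the smoothed signal can separate left/right/up/down gaze. The sketch below is only an illustrative reconstruction of that idea, not the authors' method; the channel pairing, sampling rate, thresholds and sign conventions are all assumptions and would need per-user calibration.

```python
# Illustrative sketch (not the paper's algorithm): threshold-based detection of
# extreme eye positions from EOG artefacts in two frontal EEG channels.
# Sampling rate, window length, thresholds and sign conventions are assumptions.

import numpy as np

FS = 128            # assumed sampling rate of the BCI headset (Hz)
WINDOW = FS // 4    # assumed 250 ms analysis window


def moving_average(x, n):
    """Simple low-pass smoothing to suppress high-frequency EEG activity."""
    kernel = np.ones(n) / n
    return np.convolve(x, kernel, mode="same")


def classify_gaze(left_ch, right_ch, h_thresh=40.0, v_thresh=60.0):
    """Classify an extreme eye position from two frontal EEG channels (in µV).

    left_ch / right_ch: 1-D arrays covering one analysis window.
    Horizontal saccades produce opposite-sign deflections on the two sides;
    vertical saccades (and blinks) produce same-sign deflections.
    Thresholds are placeholders chosen for the synthetic example below.
    """
    left = moving_average(left_ch, WINDOW)
    right = moving_average(right_ch, WINDOW)

    horizontal = np.mean(left - right)      # lateral EOG component
    vertical = np.mean(left + right) / 2.0  # vertical EOG component

    if horizontal > h_thresh:
        return "left"
    if horizontal < -h_thresh:
        return "right"
    if vertical > v_thresh:
        return "up"
    if vertical < -v_thresh:
        return "down"
    return "center"


if __name__ == "__main__":
    # Usage example with synthetic data standing in for a real recording:
    rng = np.random.default_rng(0)
    noise = rng.normal(0.0, 5.0, FS)
    # Simulated hard-left gaze: positive deflection on the left frontal
    # channel, negative on the right.
    print(classify_gaze(noise + 80.0, noise - 80.0))  # -> "left"
```

In practice the paper captures the data with a consumer BCI headset and reports true-positive rates of up to 96.6% for the four extreme positions; a real implementation would add per-user calibration and blink rejection on top of a scheme like the above.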
