Fixation-based Self-calibration for Eye Tracking in VR Headsets
PubDate: Nov 2023
Teams: Osaka University
Writers: Ryusei Uramune, Sei Ikeda, Hiroki Ishizuka, Osamu Oshiro
PDF: Fixation-based Self-calibration for Eye Tracking in VR Headsets
Abstract
This study proposes a novel self-calibration method for eye tracking in a virtual reality (VR) headset. The proposed method is based on the assumptions that the user’s viewpoint can move freely and that the points of regard (PoRs) from different viewpoints are distributed within a small area on an object surface during visual fixation. In the method, fixations are first detected from the time-series data of uncalibrated gaze directions using an extension of the I-VDT (velocity and dispersion threshold identification) algorithm to a three-dimensional (3D) scene. The calibration parameters are then optimized by minimizing the sum of dispersion metrics of the PoRs. The proposed method can potentially identify the optimal calibration parameters representing the user-dependent offset from the optical axis to the visual axis without explicit user calibration, image processing, or marker-substitute objects. For the gaze data of 18 participants walking in two VR environments with many occlusions, the proposed method achieved an accuracy of 2.1°, which was significantly lower than the average offset. Our method is the first self-calibration method with an average error below 3° in 3D environments. Furthermore, the accuracy of the proposed method can be improved by up to 1.2° by refining the fixation detection or optimization algorithm.
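To make the optimization step concrete, below is a minimal Python sketch of the dispersion-minimization idea, not the paper's implementation: it assumes a simplified two-parameter (yaw/pitch) offset model and intersects gaze rays with a flat surface at a fixed depth as a stand-in for ray casting against the VR scene mesh. The names `offset_rotation`, `por_on_plane`, and `dispersion_cost` are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def offset_rotation(yaw, pitch):
    """Rotation taking a measured optical-axis direction to a candidate
    visual-axis direction (hypothetical yaw/pitch offset model, radians)."""
    Ry = np.array([[np.cos(yaw), 0.0, np.sin(yaw)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(yaw), 0.0, np.cos(yaw)]])
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(pitch), -np.sin(pitch)],
                   [0.0, np.sin(pitch), np.cos(pitch)]])
    return Ry @ Rx

def por_on_plane(origin, direction, depth=2.0):
    """Point of regard: intersect the gaze ray with the plane z = depth.
    A real implementation would ray-cast against the VR scene geometry."""
    t = (depth - origin[2]) / direction[2]
    return origin + t * direction

def dispersion_cost(params, fixations):
    """Sum over fixations of the PoR scatter around each fixation centroid."""
    R = offset_rotation(*params)
    cost = 0.0
    for origins, dirs in fixations:
        corrected = dirs @ R.T  # apply the candidate offset correction
        pors = np.array([por_on_plane(o, d)
                         for o, d in zip(origins, corrected)])
        cost += np.sum((pors - pors.mean(axis=0)) ** 2)
    return cost

# Synthetic demo: a swaying viewpoint fixating known 3D targets.
rng = np.random.default_rng(0)
true_offset = np.radians([3.0, -2.0])          # unknown in practice
R_true = offset_rotation(*true_offset)
fixations = []
for target in rng.uniform([-1, -1, 2], [1, 1, 2], size=(5, 3)):
    origins = rng.normal(0.0, 0.05, size=(20, 3))   # head sway per fixation
    true_dirs = target - origins
    true_dirs /= np.linalg.norm(true_dirs, axis=1, keepdims=True)
    # Simulated uncalibrated measurements: the visual axis rotated by the
    # inverse offset, so applying offset_rotation(*true_offset) undoes it.
    dirs = true_dirs @ R_true
    fixations.append((origins, dirs))

res = minimize(dispersion_cost, x0=[0.0, 0.0], args=(fixations,),
               method="Nelder-Mead")
print("recovered offset (deg):", np.degrees(res.x))
```

On synthetic data such as this demo, the Nelder-Mead search recovers the injected offset because the dispersion of the PoRs vanishes only when the correct offset is applied; the paper's setting additionally requires robust fixation detection (the 3D I-VDT extension) and intersection with real scene geometry.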