CNN-based estimation of gaze distance in virtual reality using eye tracking and depth data

Note: We don't have the ability to review this paper.

PubDate: May 2025

Teams: University of Tübingen

Writers: Anna-Lena von Behren, Yannick Sauer, Björn Severitt, Siegfried Wahl

PDF: CNN-based estimation of gaze distance in virtual reality using eye tracking and depth data

Abstract

Eye tracking in virtual reality (VR) can improve realism and immersion, for example, with gaze-contingent depth-of-field simulations. For this application, knowing the distance of the fixated object, not just the gaze direction, is crucial. One common approach estimates gaze distance from vergence, the relative angle between the eyes, but the accuracy of this method is limited, particularly for larger distances. Alternatively, the gaze distance in VR can be retrieved directly from the depth map at the point of estimated gaze. However, eye tracking inaccuracies may result in the measured gaze being directed at an incorrect object, leading to a wrong distance estimation. This issue can occur particularly when fixating on small targets or edges of objects. To address this, we introduce a CNN-based method, which combines depth map data with vergence information from eye tracking. Our model successfully learns to combine information from both features and outperforms state-of-the-art methods.
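The paper does not disclose its network architecture in the abstract, so the following is only a minimal illustrative sketch of the general idea: a small CNN that fuses a depth-map patch cropped around the estimated gaze point with the vergence angle (from which distance can be derived geometrically as roughly IPD / (2 tan(α/2))) and regresses the fixation distance. All layer sizes, the patch size, and the single-scalar vergence feature are assumptions for illustration, not the authors' model.

```python
import torch
import torch.nn as nn

class GazeDistanceNet(nn.Module):
    """Toy regressor: depth patch around the gaze point + vergence angle -> gaze distance.
    Hypothetical architecture; not the model described in the paper."""
    def __init__(self, patch_size=32):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        feat = 32 * (patch_size // 4) ** 2
        self.head = nn.Sequential(
            nn.Linear(feat + 1, 64), nn.ReLU(),  # +1 for the vergence feature
            nn.Linear(64, 1),                    # predicted gaze distance
        )

    def forward(self, depth_patch, vergence_angle):
        # depth_patch: (B, 1, H, W) crop of the VR depth map at the estimated gaze point
        # vergence_angle: (B, 1) relative angle between the two eyes' gaze rays (radians)
        x = self.conv(depth_patch).flatten(1)
        x = torch.cat([x, vergence_angle], dim=1)
        return self.head(x)

# Example forward pass with a 32x32 depth patch and one vergence value per sample
model = GazeDistanceNet()
depth_patch = torch.rand(4, 1, 32, 32)              # depth values in meters (assumed)
vergence = torch.deg2rad(torch.full((4, 1), 2.0))   # ~2 degrees of vergence
print(model(depth_patch, vergence).shape)           # torch.Size([4, 1])
```

Feeding both cues lets the network fall back on vergence when the depth lookup lands on the wrong object (e.g., near small targets or object edges), which is the failure mode the abstract highlights.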
