Front Camera Eye Tracking For Mobile VR
PubDate: May 2020
Teams: Technical University of Crete; Durham University
Writers: Panagiotis Drakopoulos; George Alex Koulieris; Katerina Mania
PDF: Front Camera Eye Tracking For Mobile VR
Abstract
User fixations are a fast and natural input method for VR interaction. Previous attempts at mobile eye tracking in VR were limited by low accuracy, long processing times, and the need for hardware add-ons such as anti-reflective lens coatings and IR emitters. We present an innovative mobile VR eye tracking methodology that uses only the images captured by the front-facing (selfie) camera through the headset’s lens, without any hardware modifications. The system enhances the low-quality captured images, which suffer from low contrast and poor lighting, by applying a pipeline of customized low-level image enhancements that suppress obtrusive reflections caused by the headset lenses. We then perform calibration and linear gaze mapping between the estimated iris centroids and physical pixels on the screen, resulting in real-time iris tracking. A preliminary study confirms that the presented eye tracking methodology performs comparably to eye trackers in commercial VR headsets when the eyes move within the central part of the headset’s field of view.
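As a rough illustration of the calibration and linear gaze mapping step described in the abstract, the sketch below fits an affine map from estimated iris centroids (camera pixels) to known on-screen calibration targets (screen pixels) via least squares. This is a minimal sketch under assumed conventions, not the authors' implementation; the sample coordinates, function names, and the choice of an affine model with a bias term are illustrative.

```python
import numpy as np

# Hypothetical calibration data: the user fixates on K known on-screen targets
# while the tracker records the estimated iris centroid for each target.
iris_centroids = np.array([   # (K, 2) estimated iris centres, camera pixels (illustrative)
    [312.4, 201.7], [415.9, 198.2], [518.1, 203.5],
    [310.8, 305.3], [416.5, 301.9], [520.4, 306.1],
])
screen_targets = np.array([   # (K, 2) known calibration-target positions, screen pixels
    [240, 300], [720, 300], [1200, 300],
    [240, 780], [720, 780], [1200, 780],
])

def fit_linear_gaze_map(iris_xy, screen_xy):
    """Least-squares fit of an affine map from iris-centroid coordinates
    to screen pixels: screen ~= [x, y, 1] @ A, with A of shape (3, 2)."""
    ones = np.ones((iris_xy.shape[0], 1))
    X = np.hstack([iris_xy, ones])                      # homogeneous iris coords
    A, *_ = np.linalg.lstsq(X, screen_xy, rcond=None)   # solve X @ A ~= screen_xy
    return A

def map_gaze(A, iris_xy):
    """Map a newly estimated iris centroid to an on-screen gaze point."""
    return np.hstack([iris_xy, 1.0]) @ A

A = fit_linear_gaze_map(iris_centroids, screen_targets)
print(map_gaze(A, np.array([417.0, 250.0])))  # gaze estimate near screen centre
```

In a real pipeline the iris centroids fed into this mapping would come from the enhanced camera frames, after the low-level reflection-suppression and contrast enhancements mentioned in the abstract.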