Real-time Retinal Localization for Eye-tracking in Head-mounted Displays
PubDate: June 2020
Teams: University of Washington, Magic Leap
Writers: Chen Gong, Laura Trutoiu, Brian Schowengerdt, Steven L. Brunton, Eric J. Seibel
PDF: Real-time Retinal Localization for Eye-tracking in Head-mounted Displays
Abstract
Accurate and robust eye-tracking is highly desirable in head-mounted displays. This work investigates a method of estimating eye gaze from retinal movement videos. Each frame of the retinal movement video is localized on a mosaicked, large-field-of-view search image. The localization is based on a Kalman filter that embeds deep learning in the estimation process, with image registration providing the measurement. The algorithm is demonstrated in experiments in which retinal movement videos are captured from a dynamic, realistic phantom. The average localization accuracy of our algorithm is 0.68°, excluding the annotation error. The classic pupil-glint eye-tracking method has an average error of 0.5°-1°, while using retina videos yields a tracking resolution of 0.05° per pixel, nearly 20 times higher than that of pupil-glint methods. The accuracy of our inherently robust method is expected to improve with further development.
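The abstract describes a Kalman filter whose measurement comes from image registration of each video frame against the mosaicked search image. As a rough illustration of that estimation loop (not the paper's actual model: the state layout, constant-velocity dynamics, frame rate, and all noise covariances below are assumptions), a minimal 2-D position tracker might look like:

```python
import numpy as np

# Illustrative constant-velocity Kalman filter for tracking a 2-D retinal
# position. All matrices and noise values are assumed for this sketch; in
# the paper, the measurement z would be the frame's registered location on
# the mosaicked search image.

dt = 1.0 / 30.0  # assumed video frame rate

# State vector: [x, y, vx, vy]
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)   # state transition
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)    # registration observes position only
Q = np.eye(4) * 1e-3   # assumed process noise covariance
R = np.eye(2) * 1e-2   # assumed registration (measurement) noise covariance

def kalman_step(x, P, z):
    """One predict/update cycle; z is the registered (x, y) measurement."""
    # Predict forward with the motion model
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Correct with the image-registration measurement
    innov = z - H @ x_pred                   # innovation
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ innov
    P_new = (np.eye(4) - K @ H) @ P_pred
    return x_new, P_new

# Usage: track a synthetic, slowly drifting gaze point with noisy measurements
x = np.zeros(4)
P = np.eye(4)
rng = np.random.default_rng(0)
for t in range(60):
    true_pos = np.array([0.1 * t * dt, 0.05 * t * dt])
    z = true_pos + rng.normal(0.0, 0.05, size=2)  # noisy "registration" output
    x, P = kalman_step(x, P, z)
```

The filter smooths the frame-to-frame registration noise, which is one plausible reason the paper fuses registration measurements with a dynamic model rather than trusting each frame's registration alone.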