
Real-Time Gaze Tracking with Event-Driven Eye Segmentation

Note: We do not have the ability to review papers.

PubDate: Jan 2022

Teams: University of Rochester; Reality Labs Research

Writers: Yu Feng, Nathan Goulding-Hotta, Asif Khan, Hans Reyserhove, Yuhao Zhu

PDF: Real-Time Gaze Tracking with Event-Driven Eye Segmentation

Abstract

Gaze tracking is increasingly becoming an essential component in Augmented and Virtual Reality. Modern gaze tracking algorithms are heavyweight; they operate at most 5 Hz on mobile processors, even though near-eye cameras comfortably operate at a real-time rate (> 30 Hz). This paper presents a real-time eye tracking algorithm that, on average, operates at 30 Hz on a mobile processor and achieves 0.1°–0.5° gaze accuracies, all while requiring only 30K parameters, one to two orders of magnitude fewer than state-of-the-art eye tracking algorithms. The crux of our algorithm is an Auto ROI mode, which continuously predicts the Regions of Interest (ROIs) of near-eye images and judiciously processes only the ROIs for gaze estimation. To that end, we introduce a novel, lightweight ROI prediction algorithm that emulates an event camera. We discuss how a software emulation of events enables accurate ROI prediction without requiring special hardware. The code of our paper is available at this https URL.
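The abstract does not spell out how the event emulation works, but the general idea, flagging pixels whose brightness changes sharply between consecutive near-eye frames and using those "events" to bound the region worth processing, can be sketched as follows. This is a minimal illustration under assumptions, not the authors' algorithm: the functions `emulate_events` and `predict_roi`, the log-intensity thresholding, and the parameter values (`threshold`, `margin`, the minimum event count) are all hypothetical.

```python
import numpy as np

def emulate_events(prev_frame, curr_frame, threshold=0.15):
    """Software event emulation (sketch): mark pixels whose log-intensity
    change between two consecutive grayscale frames exceeds a contrast
    threshold, mimicking the output of an event camera."""
    eps = 1e-3  # avoid log(0)
    log_prev = np.log(prev_frame.astype(np.float32) + eps)
    log_curr = np.log(curr_frame.astype(np.float32) + eps)
    return np.abs(log_curr - log_prev) > threshold

def predict_roi(event_mask, margin=8, min_events=50):
    """Predict an ROI as the padded bounding box of the emulated events.
    Returns None when too few events fire (caller can reuse the last ROI)."""
    ys, xs = np.nonzero(event_mask)
    if len(xs) < min_events:
        return None
    h, w = event_mask.shape
    x0, x1 = max(xs.min() - margin, 0), min(xs.max() + margin, w - 1)
    y0, y1 = max(ys.min() - margin, 0), min(ys.max() + margin, h - 1)
    return x0, y0, x1, y1

# Usage sketch: crop the current near-eye frame to the predicted ROI and run
# the lightweight segmentation / gaze-estimation network only on that crop,
# instead of the full frame.
```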
