
Power-efficient and shift-robust eye-tracking sensor for portable VR headsets

Note: We are not able to review papers.

PubDate: June 2019

Teams: Texas State University

Writers: Dmytro Katrychuk, Henry K. Griffith, Oleg V. Komogortsev

PDF: Power-efficient and shift-robust eye-tracking sensor for portable VR headsets

Abstract

Photosensor oculography (PSOG) is a promising solution for reducing the computational requirements of eye-tracking sensors in wireless virtual and augmented reality platforms. This paper proposes a novel machine learning-based solution to the known performance degradation of PSOG devices in the presence of sensor shifts. Namely, we introduce a convolutional neural network model capable of providing shift-robust end-to-end gaze estimates from the PSOG array output. Moreover, we propose a transfer-learning strategy for reducing model training time. Using a simulated workflow with improved realism, we show that the proposed convolutional model offers improved accuracy over a previously considered multilayer perceptron approach. In addition, we demonstrate that transferring initialization weights from pre-trained models can substantially reduce training time for new users. Finally, we discuss the design trade-offs between accuracy, training time, and power consumption among the considered models.
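To make the abstract's pipeline concrete, the sketch below caricatures the idea of regressing gaze from a PSOG photosensor array with a convolutional front end, and of reusing pre-trained weights as initialization for a new user. The 3×5 sensor grid, the single 3×3 convolution, and all weights here are illustrative assumptions, not the architecture from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, w):
    """Valid 2D cross-correlation of a single-channel frame x with kernel w."""
    kh, kw = w.shape
    H, W = x.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * w)
    return out

# Hypothetical 3x5 PSOG grid: one frame of raw photosensor intensities.
frame = rng.normal(size=(3, 5))

# "Pre-trained" parameters (random here for illustration): one conv kernel
# plus a linear regression head mapping features to (horizontal, vertical) gaze.
kernel_pretrained = rng.normal(size=(3, 3))
head_pretrained = rng.normal(size=(2, 3))  # 3 = flattened conv-output size

# Transfer learning: initialize a new user's model from the pre-trained
# weights instead of from scratch, then fine-tune (fine-tuning omitted).
kernel = kernel_pretrained.copy()
head = head_pretrained.copy()

# Forward pass: conv + ReLU feature extraction, then the linear gaze head.
features = np.maximum(conv2d(frame, kernel), 0.0).ravel()
gaze = head @ features  # 2-vector: estimated gaze angles
print(gaze.shape)  # (2,)
```

A learned convolution shares weights across the sensor grid, which is one intuition for why such a model can stay accurate when the whole array shifts relative to the eye; a plain multilayer perceptron ties each weight to a fixed sensor position.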
