Real-time gaze prediction in virtual reality

Note: We are not able to review this paper.

PubDate: June 2022

Teams: Aalto University; University of Helsinki

Writers: Gazi Karam Illahi; Matti Siekkinen; Teemu Kämäräinen; Antti Ylä-Jääski

PDF: Real-time gaze prediction in virtual reality

Abstract

Gaze is an important indicator of visual attention, and knowledge of gaze location can be used to improve and augment Virtual Reality (VR) experiences. This has led to the development of VR Head-Mounted Displays (HMDs) with built-in gaze trackers. Given the latency constraints of VR, foreknowledge of gaze, i.e., before it is reported by the gaze tracker, can similarly be leveraged to preemptively apply gaze-based improvements and augmentations to a VR experience, especially in distributed VR architectures. In this paper, we propose a lightweight neural network based method utilizing only past HMD pose and gaze data to predict future gaze locations, forgoing computationally heavy saliency computation. Most work in this domain has focused on either 360° or egocentric video, or on synthetic VR content with rather naive interaction dynamics such as free viewing or supervised visual search tasks. Our solution considers data from the exhaustive OpenNEEDS dataset, which contains 6 Degrees of Freedom (6DoF) data captured in VR experiences with subjects given the freedom to explore the VR scene and/or to engage in tasks. Our solution outperforms the very strict baseline of using the current gaze location as the prediction, in real time, for sub-150 ms prediction horizons relevant to VR use cases.
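To make the idea concrete, below is a minimal sketch of how a lightweight gaze predictor of this kind could be wired up: a small feed-forward network that consumes a short window of past HMD pose (6DoF) and gaze samples and regresses the 2D gaze location at a fixed future horizon, compared against the "current gaze" baseline mentioned in the abstract. This is not the authors' architecture; the class name, window length, layer sizes, and input layout are all illustrative assumptions.

```python
# Sketch of a lightweight gaze-prediction model (assumed architecture, not the
# paper's exact method): past HMD pose + gaze history -> future 2D gaze point.
import torch
import torch.nn as nn


class GazePredictor(nn.Module):
    def __init__(self, window: int = 10, pose_dim: int = 6, gaze_dim: int = 2,
                 hidden: int = 64):
        super().__init__()
        in_dim = window * (pose_dim + gaze_dim)   # flattened history window
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, gaze_dim),          # gaze (x, y) at the horizon
        )

    def forward(self, history: torch.Tensor) -> torch.Tensor:
        # history: (batch, window, pose_dim + gaze_dim)
        return self.net(history.flatten(start_dim=1))


if __name__ == "__main__":
    batch, window = 4, 10
    history = torch.randn(batch, window, 6 + 2)   # past pose + gaze samples
    model = GazePredictor(window=window)
    predicted_gaze = model(history)               # (batch, 2)

    # "Current gaze" baseline from the abstract: reuse the latest reported
    # gaze sample as the prediction for the future horizon.
    baseline_gaze = history[:, -1, 6:]
    print(predicted_gaze.shape, baseline_gaze.shape)
```

Because the model avoids per-frame saliency computation and uses only pose and gaze signals already available on the headset, inference of this kind can plausibly run within the tight latency budgets of real-time VR rendering.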
