
Eye Tracking-based LSTM for Locomotion Prediction in VR

Note: We don't have the ability to review papers

PubDate: April 2022

Teams: University of Muenster

Writers: Niklas Stein; Gianni Bremer; Markus Lappe

PDF: Eye Tracking-based LSTM for Locomotion Prediction in VR

Abstract

Virtual Reality (VR) allows users to perform natural movements such as hand gestures, head turns, and natural walking in virtual environments. While such movements enable seamless natural interaction, they require a large tracking space, particularly in the case of walking. To optimize use of the available physical space, prediction models for upcoming behavior are helpful. In this study, we examined whether a user’s eye movements, tracked by current VR hardware, can improve such predictions. Eighteen participants walked through a virtual environment while performing different tasks, including walking along curved paths, avoiding or approaching objects, and conducting a search. Position, orientation, and eye-tracking features recorded in 2.5 s segments of the data were used to train an LSTM model to predict the user’s position 2.5 s into the future. We found that future positions could be predicted with an average error of 65 cm. The benefit of eye-movement data depended on the task and environment. In particular, situations with changes in walking speed benefited from the inclusion of eye data. We conclude that a model utilizing eye-tracking data can improve VR applications in which path predictions are helpful.
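For illustration, below is a minimal sketch of the kind of model the abstract describes: an LSTM that consumes a 2.5 s window of position, orientation, and gaze features and regresses the position 2.5 s ahead. The sampling rate, feature layout, network size, and 2D floor-position output are assumptions for the sketch, not details taken from the paper.

```python
# Hypothetical sketch (PyTorch): LSTM mapping a 2.5 s feature window
# to the walker's position 2.5 s into the future. All dimensions and
# hyperparameters below are assumptions, not the paper's settings.
import torch
import torch.nn as nn

SEQ_LEN = 225     # 2.5 s at an assumed 90 Hz tracking rate
N_FEATURES = 10   # e.g. head position + orientation + gaze direction
HIDDEN = 128

class LocomotionLSTM(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(N_FEATURES, HIDDEN, num_layers=2, batch_first=True)
        self.head = nn.Linear(HIDDEN, 2)   # predicted (x, z) floor position

    def forward(self, x):                  # x: (batch, SEQ_LEN, N_FEATURES)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])       # regress from the last time step

model = LocomotionLSTM()
window = torch.randn(8, SEQ_LEN, N_FEATURES)   # dummy batch of 2.5 s segments
future_pos = model(window)                     # (8, 2): positions 2.5 s ahead
loss = nn.functional.mse_loss(future_pos, torch.zeros(8, 2))  # e.g. metric error
```

With a setup like this, the paper's 65 cm figure would correspond to the mean Euclidean distance between predicted and actual future positions on held-out data; the eye-tracking columns could be ablated from the feature window to measure their task-dependent contribution.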
