Human Movement Direction Prediction using Virtual Reality and Eye Tracking

Note: We don't have the ability to review papers.

PubDate: June 2021

Teams: Chalmers University of Technology

Writers: Julius Pettersson; Petter Falkman

PDF: Human Movement Direction Prediction using Virtual Reality and Eye Tracking

Abstract

One way of potentially improving the use of robots in a collaborative environment is through prediction of human intention, which would give the robots insight into how the operators are about to behave. An important part of human behaviour is arm movement, and this paper presents a method to predict arm movement based on the operator's eye gaze. A test scenario has been designed in order to gather coordinate-based hand movement data in a virtual reality environment. The results show that the eye gaze data can successfully be used to train an artificial neural network that is able to predict the direction of movement ~500 ms ahead of time.
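The abstract's core idea, training a neural network on gaze coordinates to anticipate movement direction, can be illustrated with a minimal sketch. The code below is a hypothetical illustration, not the authors' implementation: it generates synthetic gaze windows (2-D coordinates drifting left or right), then trains a small NumPy-only feed-forward network to classify the upcoming movement direction.

```python
import numpy as np

# Hypothetical sketch (not the paper's actual model or data): predict
# movement direction (left vs. right) from a short window of 2-D gaze
# coordinates, mimicking the idea of anticipating arm movement from gaze.

rng = np.random.default_rng(0)

def make_sample(direction):
    """Synthetic gaze window: 10 (x, y) points drifting toward the target."""
    t = np.linspace(0.0, 1.0, 10)
    x = direction * t + rng.normal(0.0, 0.05, 10)  # drift left (-1) or right (+1)
    y = rng.normal(0.0, 0.05, 10)                  # vertical jitter only
    return np.concatenate([x, y])                  # flatten to a 20-dim feature

X = np.array([make_sample(d) for d in rng.choice([-1, 1], 200)])
labels = (X[:, :10].mean(axis=1) > 0).astype(float)  # 1 = rightward drift

# One-hidden-layer MLP trained with plain gradient descent on cross-entropy.
W1 = rng.normal(0, 0.1, (20, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.1, 8);       b2 = 0.0

for _ in range(1000):
    h = np.tanh(X @ W1 + b1)                           # hidden activations
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))           # sigmoid output
    grad_out = (p - labels) / len(X)                   # dL/d(logit), averaged
    W2 -= 0.5 * (h.T @ grad_out); b2 -= 0.5 * grad_out.sum()
    grad_h = np.outer(grad_out, W2) * (1 - h**2)       # backprop through tanh
    W1 -= 0.5 * (X.T @ grad_h); b1 -= 0.5 * grad_h.sum(axis=0)

accuracy = ((p > 0.5) == labels).mean()
print(f"training accuracy: {accuracy:.2f}")
```

The synthetic drift signal here is deliberately easy to separate; the paper's contribution lies in showing that real gaze data recorded in VR carries a comparable predictive signal roughly 500 ms before the arm movement occurs.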
