
Estimating Gaze Depth Using Multi-Layer Perceptron

Note: We do not have the ability to review this paper.

PubDate: July 2017

Teams: Mokpo National University; KETI; Keio University; University of South Australia

Writers: Youngho Lee; Choonsung Shin; Alexander Plopski; Yuta Itoh; Thammathip Piumsomboon; Arindam Dey; Gun Lee; Seungwon Kim; Mark Billinghurst

PDF: Estimating Gaze Depth Using Multi-Layer Perceptron

Abstract

In this paper we describe a new method for determining gaze depth in a head-mounted eye-tracker. Eye-trackers are being incorporated into head-mounted displays (HMDs), and eye-gaze is being used for interaction in Virtual and Augmented Reality. For some interaction methods it is important to accurately measure not only the x- and y-direction of the eye-gaze, but especially the focal depth. Eye-tracking technology generally has high accuracy in the x- and y-directions, but not in depth. We used a binocular gaze tracker with two eye cameras, and the gaze vectors were input to an MLP neural network for training and estimation. For the performance evaluation, data was obtained from 13 people gazing at fixed points at distances from 1 m to 5 m. Classifying gaze into the fixed distances produced an average classification error of nearly 10% and an average error distance of 0.42 m. This is sufficient for some Augmented Reality applications, but more research is needed to estimate a user's gaze moving in continuous space.
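The abstract only outlines the pipeline (binocular gaze vectors fed to an MLP that classifies the fixation distance), so the following is a minimal sketch of that idea rather than the authors' implementation. It assumes scikit-learn's MLPClassifier, a hypothetical 6-D feature made of the concatenated left/right gaze direction vectors, and synthetic vergence data standing in for real eye-tracker recordings.

```python
# Sketch only: classify gaze depth (1-5 m) from binocular gaze vectors with an MLP.
# The feature layout (two 3D gaze directions -> 6D input), the inter-pupillary
# distance, and the synthetic data are illustrative assumptions, not from the paper.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
DEPTH_CLASSES = [1, 2, 3, 4, 5]  # fixation distances in metres, as in the evaluation

def synthetic_gaze_sample(depth, noise=0.01):
    """Simulate left/right gaze direction vectors converging on a point `depth` metres away."""
    ipd = 0.063  # assumed inter-pupillary distance in metres
    target = np.array([0.0, 0.0, depth])
    left_eye, right_eye = np.array([-ipd / 2, 0.0, 0.0]), np.array([ipd / 2, 0.0, 0.0])
    left = target - left_eye
    right = target - right_eye
    left = left / np.linalg.norm(left) + rng.normal(0.0, noise, 3)
    right = right / np.linalg.norm(right) + rng.normal(0.0, noise, 3)
    return np.concatenate([left, right])  # 6-D feature vector

X = np.array([synthetic_gaze_sample(d) for d in DEPTH_CLASSES for _ in range(200)])
y = np.array([d for d in DEPTH_CLASSES for _ in range(200)])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

mlp = MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
mlp.fit(X_train, y_train)

pred = mlp.predict(X_test)
print("classification error:", np.mean(pred != y_test))
print("mean depth error (m):", np.mean(np.abs(pred - y_test)))
```

Because the depth classes are ordered, the absolute difference between the predicted and true class doubles as a rough "error distance" in metres, mirroring the two error measures reported in the abstract.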
