NVGaze: An Anatomically-Informed Dataset for Low-Latency, Near-Eye Gaze Estimation

Note: We do not have the ability to review papers.

Title: NVGaze: An Anatomically-Informed Dataset for Low-Latency, Near-Eye Gaze Estimation

Teams: Nvidia; UNC

Writers: Joohwan Kim; Michael Stengel; Alexander Majercik (Nvidia); Shalini De Mello; David Dunn (UNC); Samuli Laine; Morgan McGuire; David Luebke

Publication date: May 4, 2019

Abstract

Quality, diversity, and size of the training dataset are critical factors for learning-based gaze estimators. We create two datasets satisfying these criteria for near-eye gaze estimation under infrared illumination: a synthetic dataset using anatomically-informed eye and face models with variations in face shape, gaze direction, pupil and iris, skin tone, and external conditions (two million images at 1280×960), and a real-world dataset collected with 35 subjects (2.5 million images at 640×480). Using our datasets, we train a neural network for gaze estimation, achieving 2.06 (±0.44) degrees of accuracy across a wide 30×40 degree field of view on real subjects excluded from training, and 0.5 degrees best-case accuracy (across the same field of view) when explicitly trained for one real subject. We also train a variant of our network to perform pupil estimation, showing higher robustness than previous methods. Our network requires fewer convolutional layers than previous networks, achieving sub-millisecond latency.
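
The abstract describes a low-latency gaze estimator built from only a few convolutional layers, but does not spell out the architecture. The sketch below is a minimal illustration of that general idea, not the authors' NVGaze network: the layer counts, channel widths, and the 127×127 grayscale infrared input resolution are assumptions chosen purely to show how a shallow CNN can regress a 2D gaze direction.

```python
# Minimal illustrative sketch of a shallow CNN gaze regressor (PyTorch).
# NOT the NVGaze architecture: layer sizes and the 127x127 input resolution
# are assumptions used only to demonstrate a few-layer, low-latency design.
import torch
import torch.nn as nn

class TinyGazeNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1),   # IR eye image -> 16 feature maps
            nn.ReLU(inplace=True),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),                                 # global average pooling
        )
        self.head = nn.Linear(64, 2)  # regress (yaw, pitch) gaze angles in degrees

    def forward(self, x):
        x = self.features(x)
        return self.head(torch.flatten(x, 1))

if __name__ == "__main__":
    # Example: one 1x127x127 grayscale eye image -> predicted gaze angles.
    model = TinyGazeNet().eval()
    with torch.no_grad():
        gaze = model(torch.rand(1, 1, 127, 127))
    print(gaze.shape)  # torch.Size([1, 2])
```

With so few convolutions and a small input, a forward pass of a network like this can run in well under a millisecond on a modern GPU, which is the latency regime the paper targets.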
