IntelliPupil: Pupillometric Light Modulation for Optical See-Through Head-Mounted Displays
PubDate: January 2019
Teams: Osaka University; Nara Institute of Science and Technology
Writers: Chang Liu; Alexander Plopski; Kiyoshi Kiyokawa; Photchara Ratsamee; Jason Orlosky
PDF: IntelliPupil: Pupillometric Light Modulation for Optical See-Through Head-Mounted Displays
Abstract
In practical use of optical see-through head-mounted displays, users often have to adjust the brightness of virtual content to ensure that it is at the optimal level. Automatic adjustment is still a challenging problem, largely due to the bidirectional nature of the structure of the human eye, complexity of real world lighting, and user perception. Allowing the right amount of light to pass through to the retina requires a constant balance of incoming light from the real world, additional light from the virtual image, pupil contraction, and feedback from the user. While some automatic light adjustment methods exist, none have completely tackled this complex input-output system. As a step towards overcoming this issue, we introduce IntelliPupil, an approach that uses eye tracking to properly modulate augmentation lighting for a variety of lighting conditions and real scenes. We first take the data from a small form factor light sensor and changes in pupil diameter from an eye tracking camera as passive inputs. This data is coupled with user-controlled brightness selections, allowing us to fit a brightness model to user preference using a feed-forward neural network. Using a small amount of training data, both scene luminance and pupil size are used as inputs into the neural network, which can then automatically adjust to a user’s personal brightness preferences in real time. Experiments in a high dynamic range AR scenario with varied lighting show that pupil size is just as important as environment light for optimizing brightness and that our system outperforms linear models.