Optimizing Visual Element Placement via Visual Attention Analysis
PubDate: March 2019
Teams: George Mason University; University of Massachusetts Boston; Nihon Hoso Kyokai
Writers: Rawan Alghofaili; Michael S Solah; Haikun Huang; Yasuhito Sawahata; Marc Pomplun; Lap-Fai Yu
PDF: Optimizing Visual Element Placement via Visual Attention Analysis
Abstract
Eye-tracking enables researchers to conduct complex analyses of human behavior. With the recent introduction of eye-tracking into consumer-grade virtual reality headsets, the barrier to entry for visual attention analysis in virtual environments has been lowered significantly. Whether for arranging artwork in a virtual museum, posting banners for virtual events, or placing advertisements in virtual worlds, analyzing visual attention patterns provides a powerful means of guiding visual element placement. In this work, we propose a novel data-driven optimization approach for automatically analyzing visual attention and placing visual elements in 3D virtual environments. Using an eye-tracking virtual reality headset, we collect eye-tracking data, which we use to train a regression model for predicting gaze duration. We then use the gaze durations predicted by our regressors to optimize the placement of visual elements with respect to certain visual attention and design goals. Through experiments in several virtual environments, we demonstrate the effectiveness of our optimization approach both for predicting gaze duration and for placing visual elements in different practical scenarios. Our approach is implemented as a plug-in that level designers can use to automatically populate visual elements in 3D virtual environments.
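The pipeline the abstract describes, fitting a regressor on eye-tracking data and then choosing placements that score well under the predicted gaze duration, can be sketched as follows. This is a minimal illustration only, not the paper's actual model: the features (distance, view angle, saliency), the synthetic training data, and the use of ordinary least squares are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for collected eye-tracking data: each candidate
# placement is described by three hypothetical features
# [distance to viewer path, view angle, visual saliency],
# paired with an observed gaze duration in seconds.
X = rng.uniform(0.0, 1.0, size=(200, 3))
true_w = np.array([-2.0, -1.0, 3.0])          # assumed ground-truth weights
y = X @ true_w + 5.0 + rng.normal(0.0, 0.1, size=200)

# Train a simple linear regressor (with bias) to predict gaze duration.
Xb = np.hstack([X, np.ones((len(X), 1))])
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)

def predicted_gaze(features):
    """Predicted gaze duration for one candidate placement."""
    return float(np.append(features, 1.0) @ w)

# Optimization step (here, exhaustive search over sampled candidates):
# pick the placement whose features maximize predicted gaze duration.
candidates = rng.uniform(0.0, 1.0, size=(50, 3))
best = max(candidates, key=predicted_gaze)
```

In the paper the regression model and the placement objective are richer (incorporating design goals alongside attention), but the shape of the loop is the same: predict gaze from placement features, then search placements against that prediction.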