A Database for Perceived Quality Assessment of User-Generated VR Videos
PubDate: Jun 2022
Teams: Jiangxi University of Finance and Economics; City University of
Writers: Yuming Fang, Yiru Yao, Xiangjie Sui, Kede Ma
Virtual reality (VR) videos (typically in the form of 360° videos) have gained increasing attention due to the fast development of VR technologies and the remarkable popularization of consumer-grade 360° cameras and displays. It is thus pivotal to understand how people perceive user-generated VR videos, which may suffer from commingled authentic distortions, often localized in space and time. In this paper, we establish one of the largest 360° video databases, containing 502 user-generated videos with rich content and distortion diversity. We capture the viewing behaviors (i.e., scanpaths) of 139 users and collect their opinion scores of perceived quality under four different viewing conditions (two starting points × two exploration times). We provide a thorough statistical analysis of the recorded data, yielding several interesting observations, such as the significant impact of viewing conditions on viewing behaviors and perceived quality. In addition, we explore other uses of our data and analysis, including the evaluation of computational models for quality assessment and saliency detection of 360° videos. We have made the dataset and code available at this https URL.