
Real-Time Viewport-Aware Optical Flow Estimation in 360-degree Videos for Visually-Induced Motion Sickness Mitigation

Note: We do not have the ability to review papers.

PubDate: Jan 2023

Teams: University of North Carolina at Greensboro

Writers: Zekun Cao, Regis Kopper

PDF: Real-Time Viewport-Aware Optical Flow Estimation in 360-degree Videos for Visually-Induced Motion Sickness Mitigation

Abstract

Visually-induced motion sickness (VIMS), a side effect of the illusory motion caused by visual stimulation, is one of the major obstacles to the widespread use of Virtual Reality (VR). Along with scene object information, this visual stimulation is primarily indicated by optical flow, which characterizes the motion pattern of the moving image, such as its intensity and direction. We estimate real-time optical flow in 360-degree videos, targeted at immersive interactive visualization, based on the user's current viewport. The proposed method allows a customized estimate of the visual flow for each viewing of a dynamic 360-degree video and improves over previous methods, which compute a single optical flow value for the entire equirectangular frame. We applied our method to modulate the opacity of Granulated Rest Frames (GRF), a novel technique consisting of noise-like, randomly distributed visual references that remain stable relative to the user's body during the experience of immersive prerecorded 360-degree videos. We report the results of a preliminary one-day between-subjects study with 18 participants in which users watched a 2-minute high-intensity 360-degree video. The results show that GRF combined with real-time optical flow estimation may help users feel more comfortable when watching 360-degree videos, although the improvement was not statistically significant.
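
The core idea described in the abstract, computing optical flow only inside the user's current viewport and using its magnitude to drive the GRF opacity, can be sketched roughly as follows. This is an illustrative approximation, not the authors' implementation: it crops the viewport directly from the equirectangular frame rather than performing a true rectilinear (gnomonic) projection, uses OpenCV's Farneback estimator as a stand-in for whatever flow method the paper employs, and the opacity thresholds and file name are made-up values.

```python
# Hypothetical sketch (not the paper's code): approximate viewport-aware optical
# flow on an equirectangular 360-degree video and map its magnitude to a
# rest-frame opacity. Requires opencv-python and numpy.
import cv2
import numpy as np


def crop_viewport(equi_frame, yaw_deg, pitch_deg, fov_deg=90):
    """Crude viewport approximation: crop the equirectangular frame around the
    current view direction. A faithful version would reproject to a rectilinear
    image; a simple crop is enough to illustrate the idea."""
    h, w = equi_frame.shape[:2]
    # Map yaw in [-180, 180] and pitch in [-90, 90] to pixel coordinates.
    cx = int((yaw_deg + 180.0) / 360.0 * w)
    cy = int((90.0 - pitch_deg) / 180.0 * h)
    half_w = int(fov_deg / 360.0 * w / 2)
    half_h = int(fov_deg / 180.0 * h / 2)
    x0, x1 = max(cx - half_w, 0), min(cx + half_w, w)
    y0, y1 = max(cy - half_h, 0), min(cy + half_h, h)
    return equi_frame[y0:y1, x0:x1]


def viewport_flow_intensity(prev_vp, curr_vp):
    """Mean dense optical-flow magnitude (pixels/frame) inside the viewport,
    using OpenCV's Farneback estimator."""
    prev_gray = cv2.cvtColor(prev_vp, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_vp, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)
    return float(magnitude.mean())


def grf_opacity(flow_intensity, low=1.0, high=8.0):
    """Map flow intensity to a [0, 1] GRF opacity. The 'low' and 'high'
    thresholds are illustrative, not values taken from the paper."""
    return float(np.clip((flow_intensity - low) / (high - low), 0.0, 1.0))


if __name__ == "__main__":
    cap = cv2.VideoCapture("example_360_video.mp4")  # hypothetical file name
    ok, prev = cap.read()
    while ok:
        ok, curr = cap.read()
        if not ok:
            break
        # Yaw/pitch would normally come from the HMD every frame; fixed here.
        prev_vp = crop_viewport(prev, yaw_deg=0.0, pitch_deg=0.0)
        curr_vp = crop_viewport(curr, yaw_deg=0.0, pitch_deg=0.0)
        intensity = viewport_flow_intensity(prev_vp, curr_vp)
        print(f"flow: {intensity:.2f}  GRF opacity: {grf_opacity(intensity):.2f}")
        prev = curr
```

In a real system the per-frame yaw and pitch would be read from the headset, and the flow magnitude would likely be smoothed over time before driving the GRF opacity, so that the rest frames fade in and out gradually rather than flickering with each frame.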
