Real-Time Shadow Detection From Live Outdoor Videos for Augmented Reality
PubDate: November 2020
Teams: Sichuan University; University of South Carolina
Writers: Yanli Liu; Xingming Zou; Songhua Xu; Guanyu Xing; Housheng Wei; Yanci Zhang
PDF: Real-Time Shadow Detection From Live Outdoor Videos for Augmented Reality
Abstract
Simulating shadow interactions between real and virtual objects is important for augmented reality (AR), in which accurately and efficiently detecting real shadows from live videos is a crucial step. Most existing methods can only process scenes captured from a fixed viewpoint. In contrast, this article proposes a new framework for shadow detection in live outdoor videos captured under moving viewpoints. The framework splits each frame into a tracked region, which is the region tracked from the previous video frame through optical flow analysis, and an emerging region, which is newly introduced into the scene due to the moving viewpoint. The framework subsequently extracts features based on the intensity profiles surrounding the boundaries of candidate shadow regions. These features are then utilized by a Bayesian learning module both to correct erroneous shadow boundaries in the tracked region and to detect shadow boundaries in the emerging region. To remove spurious shadows, spatial layout constraints are further considered for emerging regions. The experimental results demonstrate that the proposed framework outperforms the state-of-the-art shadow tracking and detection algorithms in real time on a variety of challenging cases, including shadows on backgrounds with complex textures, nonplanar shadows, fast-moving shadows with changing topologies, and shadows cast by nonrigid objects. The quantitative experiments show that our method outperforms the best existing method, achieving a 33.3% increase in the average F-measure on a self-collected database. Coupled with an image-based shadow-casting method, the proposed framework generates realistic shadow interaction results. This capability will be particularly beneficial for supporting AR applications.
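The core per-frame step described in the abstract is the split of each new frame into a tracked region (pixels with a correspondence in the previous frame) and an emerging region (pixels newly revealed by the moving viewpoint). The paper does not release code, so the sketch below is only an illustration of that split under assumptions of ours: it uses OpenCV's Farneback dense flow and an out-of-bounds correspondence test, and the function and variable names are hypothetical, not the authors' implementation.

```python
import cv2
import numpy as np

def split_tracked_and_emerging(prev_gray, curr_gray, prev_shadow_mask):
    """Split the current frame into a tracked region and an emerging region,
    and propagate the previous frame's shadow labels into the tracked region.
    Illustrative sketch only; not the authors' released code."""
    h, w = curr_gray.shape

    # Dense flow from the CURRENT frame back to the PREVIOUS frame, so each
    # current pixel (x, y) maps to (x + dx, y + dy) in the previous frame.
    flow = cv2.calcOpticalFlowFarneback(curr_gray, prev_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + flow[..., 0]).astype(np.float32)
    map_y = (grid_y + flow[..., 1]).astype(np.float32)

    # Pixels whose correspondence falls outside the previous frame have no
    # history to track; they form the emerging region.
    tracked = (map_x >= 0) & (map_x < w) & (map_y >= 0) & (map_y < h)
    emerging = ~tracked

    # Warp the previous shadow labels into the tracked region. In the paper,
    # boundary intensity-profile features and Bayesian learning then correct
    # these tracked boundaries and detect shadows in the emerging region.
    warped = cv2.remap(prev_shadow_mask.astype(np.float32),
                       map_x, map_y, cv2.INTER_NEAREST)
    tracked_shadow = (warped > 0.5) & tracked
    return tracked, emerging, tracked_shadow
```

The subsequent stages the abstract mentions (boundary-profile feature extraction, the Bayesian learning module, and the spatial layout constraints used to reject spurious shadows in emerging regions) are not sketched here.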