Distributed Generation of Large-scale 3D Dense Point Cloud for Accurate Multi-View Reconstruction
PubDate: October 2020
Teams: Beihang University; Qingdao Research Institute of Beihang University
Writers: Xijing Wang; Yao Li; Chen Wang; Yue Qi
PDF: Distributed Generation of Large-scale 3D Dense Point Cloud for Accurate Multi-View Reconstruction
Abstract
In recent decades, computer vision and computer graphics, and 3D reconstruction in particular, have been active areas of computer science research. As the scenes reconstructed from multi-view images grow ever larger, a single machine can no longer meet the requirements of dense point cloud reconstruction for large-scale scenes. In this paper, a distributed method is proposed to reconstruct the dense point cloud for accurate multi-view reconstruction. First, the initial image set is organized into a graph, which is divided into several subgraphs using an improved graph cut algorithm; the image set is thus partitioned into several small image sets according to the result of the graph cut. Then, the generation and optimization of the dense point cloud are performed on different nodes of the cluster. Finally, the dense point clouds generated on the different machines are merged on the primary node to produce a dense point cloud of the entire scene. Experiments on public large-scale data sets and on our own large-scale aerial photography show that the distributed method is fast, robust, and suitable for a variety of large scene areas.
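The pipeline described in the abstract (partition the view graph, reconstruct each part on a separate node, merge on the primary node) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: `greedy_cut` stands in for their improved graph cut algorithm, `reconstruct_part` is a placeholder for per-node dense reconstruction, and all names and parameters are assumptions.

```python
# Illustrative sketch of the distributed dense-reconstruction pipeline:
# (1) build a weighted view graph over the image set, (2) cut it into
# balanced sub-graphs, (3) reconstruct each part independently (simulated
# here), (4) merge the partial clouds on the primary node.
# All identifiers are hypothetical; the paper's actual graph cut and
# MVS steps are far more involved.
from itertools import combinations

def greedy_cut(nodes, edges, k):
    """Split `nodes` into k balanced parts, greedily keeping
    heavily matched image pairs in the same part.
    edges: {(u, v): weight} with u < v (e.g. shared feature matches)."""
    parts = [set() for _ in range(k)]
    cap = -(-len(nodes) // k)  # ceil(n/k): keep part sizes balanced
    # Place strongly connected images first.
    strength = {n: 0.0 for n in nodes}
    for (u, v), w in edges.items():
        strength[u] += w
        strength[v] += w
    for n in sorted(nodes, key=strength.get, reverse=True):
        # Put each image in the non-full part where it shares the
        # most match weight with images already assigned there.
        best = max(
            (idx for idx, p in enumerate(parts) if len(p) < cap),
            key=lambda idx: sum(
                edges.get(tuple(sorted((n, m))), 0.0) for m in parts[idx]
            ),
        )
        parts[best].add(n)
    return parts

def reconstruct_part(images):
    """Stand-in for dense reconstruction on one cluster node:
    emits one dummy 3D point per image."""
    return [(float(i), 0.0, 0.0) for i in sorted(images)]

def merge_clouds(clouds):
    """Merge the partial clouds on the primary node, dropping duplicates."""
    merged = set()
    for cloud in clouds:
        merged.update(cloud)
    return sorted(merged)

# Toy scene: 6 images; nearby views share more matches.
images = list(range(6))
matches = {(i, j): 1.0 / (j - i) for i, j in combinations(images, 2)}
parts = greedy_cut(images, matches, k=2)
cloud = merge_clouds(reconstruct_part(p) for p in parts)
```

In this sketch the balance cap and the greedy gain rule are design assumptions; the paper's improved graph cut presumably optimizes the cut weight more carefully so that images sharing many matches are reconstructed on the same node, which reduces cross-node redundancy before the final merge.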