RGBD GS-ICP SLAM
PubDate: March 2024
Team: Sungkyunkwan University
Writers: Seongbo Ha, Jiung Yeon, and Hyeonwoo Yu
PDF: RGBD GS-ICP SLAM
Abstract
Simultaneous Localization and Mapping (SLAM) with dense representation plays a key role in robotics, Virtual Reality (VR), and Augmented Reality (AR) applications. Recent advancements in dense representation SLAM have highlighted the potential of leveraging neural scene representation and 3D Gaussian representation for high-fidelity spatial representation. In this paper, we propose a novel dense representation SLAM approach that fuses Generalized Iterative Closest Point (G-ICP) and 3D Gaussian Splatting (3DGS). In contrast to existing methods, we utilize a single Gaussian map for both tracking and mapping, resulting in mutual benefits. By exchanging covariances between the tracking and mapping processes with scale alignment techniques, we minimize redundant computations and achieve an efficient system. Additionally, we enhance tracking accuracy and mapping quality through our keyframe selection methods. Experimental results demonstrate the effectiveness of our approach, showing a remarkably fast speed of up to 107 FPS for the entire system and superior quality of the reconstructed map. The code is available at https://github.com/Lab-of-AI-and-Robotics/GS-ICP-SLAM and a demo video at https://youtu.be/ebHh_uMMxE.
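The core idea of sharing covariances between G-ICP tracking and the Gaussian map can be illustrated with a minimal sketch: G-ICP estimates a covariance for every point from its local neighborhood, and these same covariances (after the usual plane-like eigenvalue regularization) can seed the 3D Gaussians instead of being recomputed during mapping. The sketch below, in plain NumPy, is an assumption-laden illustration of that principle; the function names (`knn_covariances`, `gicp_regularize`) and the brute-force neighbor search are illustrative and are not from the released code.

```python
import numpy as np

def knn_covariances(points, k=8):
    """Estimate a per-point 3x3 covariance from the k nearest
    neighbors (brute-force search; fine for a small demo cloud)."""
    n = len(points)
    covs = np.empty((n, 3, 3))
    # Pairwise distance matrix; the point itself (distance 0) is
    # included in its own neighborhood.
    dists = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    for i in range(n):
        nbrs = points[np.argsort(dists[i])[:k]]
        centered = nbrs - nbrs.mean(axis=0)
        covs[i] = centered.T @ centered / k
    return covs

def gicp_regularize(cov, eps=1e-3):
    """Standard G-ICP regularization: keep the eigenvectors but
    replace the eigenvalues with (eps, 1, 1), modeling each point
    as a locally planar patch."""
    _, vecs = np.linalg.eigh(cov)        # eigenvalues in ascending order
    w_reg = np.diag([eps, 1.0, 1.0])     # flatten the normal direction
    return vecs @ w_reg @ vecs.T

# Demo: a noisy planar patch, as a depth camera might observe.
rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, size=(200, 3))
pts[:, 2] = 0.01 * rng.normal(size=200)  # nearly flat in z

covs = knn_covariances(pts)
# The regularized covariances computed once for tracking could then
# be reused directly as initial 3D Gaussian covariances for mapping
# (the paper additionally applies scale alignment).
gaussian_covs = np.array([gicp_regularize(c) for c in covs])
```

The payoff of this reuse is that the expensive neighborhood analysis runs once per frame rather than once for tracking and again for mapping, which is one ingredient of the reported system speed.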