Advancing 6D Pose Estimation in Augmented Reality – Overcoming Projection Ambiguity with Uncontrolled Imagery
PubDate: March 2024
Teams: Kyungpook National University
Writers: Mayura Manawadu, Sieun Park, Soon-Yong Park
Abstract
This study addresses the challenge of accurate 6D pose estimation in Augmented Reality (AR), a critical component for seamlessly integrating virtual objects into real-world environments. Our research focuses on the difficulty of estimating 6D poses from uncontrolled RGB images, a common scenario in AR applications, where metadata such as the camera focal length is unavailable. We propose a novel approach that strategically decomposes the estimation of z-axis translation and focal length, leveraging the neural render-and-compare strategy of the FocalPose architecture. This methodology not only streamlines the 6D pose estimation process but also significantly improves the accuracy with which 3D objects are overlaid in AR settings. Our experimental results demonstrate a marked improvement in 6D pose estimation accuracy, with promising applications in manufacturing and robotics, where precise AR overlays and robotic vision systems stand to benefit substantially from our findings.
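
The "projection ambiguity" in the title comes from the pinhole camera model: scaling the focal length and the z-axis translation by the same factor leaves the projection of a small, roughly centered object nearly unchanged, so the two quantities cannot be recovered independently from a single uncontrolled RGB image. The sketch below illustrates this coupling numerically; it is a minimal illustration under the standard pinhole model, not code from the paper or the FocalPose implementation, and the function and variable names (project, t_z, etc.) are illustrative assumptions.

```python
# Minimal sketch (not from the paper): the focal-length / depth ambiguity
# of the pinhole model that motivates decoupling f and t_z.
import numpy as np

def project(points_3d, f, t_z, cx=320.0, cy=240.0):
    """Project object-frame points translated by t_z along the optical axis
    with a pinhole camera of focal length f (square pixels, principal point
    (cx, cy)); returns an N x 2 array of pixel coordinates."""
    X, Y, Z = points_3d[:, 0], points_3d[:, 1], points_3d[:, 2] + t_z
    return np.stack([f * X / Z + cx, f * Y / Z + cy], axis=1)

# A small object (10 cm point cloud) centered on the optical axis.
rng = np.random.default_rng(0)
obj = rng.uniform(-0.05, 0.05, size=(100, 3))

# Doubling both the focal length and the depth yields nearly identical
# projections, so f and t_z cannot be estimated independently from the
# image alone -- the ambiguity referred to in the title.
uv_a = project(obj, f=600.0, t_z=1.0)
uv_b = project(obj, f=1200.0, t_z=2.0)
print("max pixel difference:", np.abs(uv_a - uv_b).max())  # sub-pixel here
```

It is this coupling that the proposed decomposition of z-axis translation and focal length, applied within the render-and-compare iterations, is designed to break.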