Robust Object Pose Tracking for Augmented Reality Guidance and Teleoperation

Date: May 2024

Team: University of British Columbia

Authors: David Black; Septimiu Salcudean

PDF: Robust Object Pose Tracking for Augmented Reality Guidance and Teleoperation

Abstract

For many augmented reality guidance, teleoperation, or human–robot interaction systems, accurate, fast, and robust six-degree-of-freedom (6-DOF) object pose tracking is essential. However, current solutions easily lose tracking when line-of-sight to markers is lost. In this article, we present a tracking system that matches or improves on current methods in speed and accuracy, achieving 1.77 mm and 1.51° accuracy at 22 Hz, and is robust to occlusions. Reflective markers are segmented in infrared (IR) images and used for pose computation using novel voting-based point correspondence algorithms and intelligent cropping. In addition, we introduce a new square-root unscented Kalman filter (UKF), which improves accuracy and flexibility over previous approaches by tracking the markers themselves rather than the computed pose and enabling fusion of an external inertial measurement unit (IMU). This reduces noise and makes the tracking robust to brief loss of line-of-sight. The algorithms and methods are described in detail with pseudocode, tested, and analyzed. The system is implemented in simulation and on a Microsoft HoloLens 2 using Unity for ease of integration into graphical projects. The code is made available open source. Through the improvements in speed and robustness over previous methods, this solution has the potential to enable fast and reliable pose tracking for many mixed reality (MR) and teleoperation applications.
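To make the pipeline concrete, the sketch below shows the classic pose-from-correspondences step that a marker-based tracker like this ultimately relies on: given segmented IR marker centroids that have already been matched to the known marker geometry, the 6-DOF pose can be recovered with the Kabsch/Horn SVD method. This is only an illustrative assumption for context, not the paper's voting-based correspondence algorithm or its square-root UKF; the function and variable names are hypothetical.

```python
# Minimal sketch: rigid 6-DOF pose from matched 3D marker points (Kabsch/Horn).
# Illustrative only; not the paper's method.
import numpy as np

def pose_from_markers(model_pts: np.ndarray, measured_pts: np.ndarray):
    """Return (R, t) such that measured ≈ R @ model + t.

    model_pts, measured_pts: (N, 3) corresponding marker positions, e.g.
    reflective-marker centroids triangulated from IR images and already
    matched to the known marker layout on the tracked object.
    """
    # Center both point sets on their centroids.
    mu_model = model_pts.mean(axis=0)
    mu_meas = measured_pts.mean(axis=0)
    A = model_pts - mu_model
    B = measured_pts - mu_meas

    # Cross-covariance and SVD (Kabsch).
    H = A.T @ B
    U, _, Vt = np.linalg.svd(H)

    # Guard against reflections so R is a proper rotation (det = +1).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = mu_meas - R @ mu_model
    return R, t

if __name__ == "__main__":
    # Synthetic check with four markers and a known rotation/translation.
    rng = np.random.default_rng(0)
    model = rng.normal(size=(4, 3))
    angle = np.deg2rad(30)
    R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                       [np.sin(angle),  np.cos(angle), 0.0],
                       [0.0, 0.0, 1.0]])
    t_true = np.array([0.1, -0.05, 0.3])
    measured = model @ R_true.T + t_true
    R, t = pose_from_markers(model, measured)
    print(np.allclose(R, R_true), np.allclose(t, t_true))
```

In the paper's system, the correspondence matching itself is handled by the voting-based algorithms, and the resulting marker positions (rather than the computed pose) are fed to the square-root UKF together with IMU measurements to smooth noise and bridge brief occlusions.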
