3D Positioning System Based on One-handed Thumb Interactions for 3D Annotation Placement
PubDate: August 2019
Teams: Kyushu University
Writers: So Tashiro; Hideaki Uchiyama; Diego Thomas; Rin-ichiro Taniguchi
PDF: 3D Positioning System Based on One-handed Thumb Interactions for 3D Annotation Placement
Abstract
This paper presents a 3D positioning system based on one-handed thumb interactions for simple 3D annotation placement with a smartphone. To place an annotation at a target point in the real environment, users interactively select the corresponding pixels in multiple views while SLAM is performed, and the 3D coordinate of the point is computed from these selections. In general, it is difficult for users to precisely select an intended pixel on the touchscreen. Therefore, we propose computing the 3D coordinate from multiple observations with a robust estimator so that the system is tolerant to inaccurate user inputs. In addition, we developed three pixel selection methods based on one-handed thumb interactions: a pixel is selected at the thumb position in a live view (FingAR), at the position of a reticle marker in a live view (SnipAR), or at the position of a movable reticle marker in a frozen view (FreezAR). In a preliminary evaluation, we investigated the 3D positioning accuracy of each method.
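The abstract does not give implementation details, but the core idea (estimating a 3D point from several imprecise pixel selections across SLAM-tracked views, with a robust estimator to reject bad taps) can be sketched roughly as follows. This is a minimal illustration, assuming known camera intrinsics K, per-view poses (R, t) from SLAM, and a hypothetical RANSAC-style inlier threshold; it is not the authors' actual method.

```python
import numpy as np

def ray_from_pixel(K, R, t, pixel):
    """Back-project a pixel to a world-space ray (origin, unit direction).
    Assumes the projection model x ~ K (R X + t), so the camera center is -R^T t."""
    origin = -R.T @ t
    d_cam = np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
    direction = R.T @ d_cam
    return origin, direction / np.linalg.norm(direction)

def triangulate(rays):
    """Least-squares 3D point closest to a set of rays."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in rays:
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the ray direction
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

def robust_triangulate(observations, K, n_iters=100, thresh=0.02, seed=0):
    """RANSAC-style estimation: sample ray pairs, keep the candidate point
    supported by the most rays (point-to-ray distance below `thresh`, in
    scene units), then refit on the inliers. `observations` is a list of
    (R, t, pixel) tuples gathered from the user's thumb selections."""
    rng = np.random.default_rng(seed)
    rays = [ray_from_pixel(K, R, t, px) for R, t, px in observations]
    best_inliers = []
    for _ in range(n_iters):
        i, j = rng.choice(len(rays), size=2, replace=False)
        X = triangulate([rays[i], rays[j]])
        inliers = [
            (o, d) for o, d in rays
            if np.linalg.norm((np.eye(3) - np.outer(d, d)) @ (X - o)) < thresh
        ]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    return triangulate(best_inliers) if len(best_inliers) >= 2 else None
```

In this sketch, each selected pixel becomes a viewing ray, and the annotation anchor is the point best supported by those rays; selections that miss the target badly simply fail to join the inlier set, which is one plausible way to realize the paper's tolerance to inaccurate thumb input.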