Spatial Affordance-aware Interactable Subspace Allocation for Mixed Reality Telepresence
PubDate: Aug 2024
Teams: KAIST
Writers: Dooyoung Kim, Seonji Kim, Selin Choi, Woontack Woo
PDF: Spatial Affordance-aware Interactable Subspace Allocation for Mixed Reality Telepresence
Abstract
To enable remote Virtual Reality (VR) and Augmented Reality (AR) clients to collaborate as if they were in the same space during Mixed Reality (MR) telepresence, it is essential to overcome spatial heterogeneity and generate a unified shared collaborative environment by integrating remote spaces into a target host space. Especially when multiple remote users connect, a large shared space is necessary so that people can maintain their personal space while collaborating, but the existing simple intersection method produces increasingly narrow shared spaces as the number of remote spaces grows. To align robustly with the host space even as the number of remote spaces increases, we propose a spatial affordance-aware interactable subspace allocation algorithm. The key concept of our approach is to treat the perceivable and interactable areas separately: every user views the same mutual space, but each remote user receives a different interactable subspace determined by their location and spatial affordance. We conducted an evaluation with 900 space combinations, varying the number of remote spaces among two, four, and six, and the results show that our method outperformed other spatial matching methods in securing a wide interactable mutual space and instantiating users. Our work enables multiple clients from diverse remote locations to access the AR host’s space, allowing them to interact directly with the table, wall, or floor by aligning their physical subspaces within a connected mutual space.
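To make the core idea concrete, below is a minimal sketch contrasting the naive N-way intersection baseline with per-user interactable subspaces. It simplifies each space to an axis-aligned free-floor rectangle; the `Rect`, `naive_shared_space`, and `per_user_subspaces` names are hypothetical illustrations, not the paper's actual algorithm, which additionally accounts for user location and spatial affordances such as tables, walls, and floors.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    """Axis-aligned 2D free-space footprint (metres). Simplified stand-in
    for a real scanned room layout."""
    x0: float
    y0: float
    x1: float
    y1: float

    def intersect(self, other: "Rect") -> "Rect | None":
        x0, y0 = max(self.x0, other.x0), max(self.y0, other.y0)
        x1, y1 = min(self.x1, other.x1), min(self.y1, other.y1)
        return Rect(x0, y0, x1, y1) if x0 < x1 and y0 < y1 else None

    @property
    def area(self) -> float:
        return (self.x1 - self.x0) * (self.y1 - self.y0)

def naive_shared_space(host: Rect, remotes: list[Rect]) -> "Rect | None":
    # Baseline: one N-way intersection shared by everyone.
    # It shrinks (and can vanish) as more remote spaces join.
    shared: "Rect | None" = host
    for r in remotes:
        shared = shared.intersect(r) if shared else None
    return shared

def per_user_subspaces(host: Rect, remotes: list[Rect]) -> list["Rect | None"]:
    # Affordance-aware idea (simplified): all users perceive the same
    # mutual space, but each remote user's *interactable* subspace is
    # only their own free area aligned with the host space, so it does
    # not degrade as other users connect.
    return [host.intersect(r) for r in remotes]

if __name__ == "__main__":
    host = Rect(0, 0, 5, 4)
    remotes = [Rect(1, 0, 6, 3), Rect(0, 1, 4, 5), Rect(2, 2, 7, 6)]
    naive = naive_shared_space(host, remotes)
    print("naive shared area:", naive.area if naive else 0.0)  # 2.0
    for i, sub in enumerate(per_user_subspaces(host, remotes)):
        print(f"user {i} interactable area:", sub.area if sub else 0.0)  # 12, 12, 6
```

In this toy setup the three-way intersection leaves only 2 m² of shared space, while the per-user allocation preserves 12, 12, and 6 m² of interactable area respectively, illustrating why decoupling the perceivable mutual space from each user's interactable subspace scales better with the number of remote spaces.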