
Visible Patches for Haptic Rendering of Point Clouds

Note: We are not able to review this paper.

PubDate: April 2022

Teams: Southeast University

Writers: Lifeng Zhu; Yichen Xiang; Aiguo Song

PDF: Visible Patches for Haptic Rendering of Point Clouds

Abstract

Unorganized three-dimensional (3D) points are commonly acquired with modern capture tools and have become popular in many virtual reality applications. To produce haptic feedback that enriches interaction with the captured models, point clouds are usually converted to structured meshes or implicit representations. This conversion is either time-consuming or imprecise, resulting in low-fidelity haptic rendering, especially for small haptic proxies. We propose to locally reconstruct the points to balance performance and quality in the haptic rendering of point clouds. Observing that only the points visible to the haptic proxy form the candidate contact region, we introduce visible patches on the point clouds. A computational model for the visible patches is presented, and a virtual coupling model is built to update the visible patches online for haptic rendering. Cases with noise and nonuniform sampling are also discussed. We demonstrate our method on a set of synthesized and captured 3D point clouds. Various experimental results are collected and show the efficiency of our method.
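The abstract only states the visibility idea at a high level, so the following Python sketch is merely a rough, hypothetical illustration of it, not the authors' algorithm: it selects the candidate contact region as the points visible from a spherical haptic proxy, using the well-known hidden-point-removal spherical-flip heuristic as one possible visibility test. All function names and parameters below are assumptions made for illustration.

```python
# Hypothetical sketch: visibility-based candidate-contact selection for a
# point cloud and a spherical haptic proxy. Not the method from the paper.
import numpy as np
from scipy.spatial import ConvexHull


def visible_points(points, viewpoint, radius_factor=100.0):
    """Indices of points (N x 3) deemed visible from `viewpoint`.

    Hidden point removal: move the viewpoint to the origin, spherically flip
    every point about a large sphere, and keep the points that lie on the
    convex hull of the flipped cloud plus the origin.
    """
    p = points - viewpoint
    norms = np.linalg.norm(p, axis=1, keepdims=True)
    R = radius_factor * norms.max()              # radius of the flipping sphere
    flipped = p + 2.0 * (R - norms) * (p / norms)

    hull = ConvexHull(np.vstack([flipped, np.zeros(3)]))
    idx = np.unique(hull.vertices)
    return idx[idx < len(points)]                # drop the appended origin


def candidate_contact_patch(points, proxy_center, proxy_radius, margin=3.0):
    """Visible points within a few proxy radii of the proxy centre."""
    proxy_center = np.asarray(proxy_center, dtype=float)
    vis = visible_points(points, proxy_center)
    d = np.linalg.norm(points[vis] - proxy_center, axis=1)
    return vis[d < margin * proxy_radius]


if __name__ == "__main__":
    # Toy example: points on a unit sphere, proxy hovering above the north pole.
    rng = np.random.default_rng(0)
    pts = rng.normal(size=(2000, 3))
    pts /= np.linalg.norm(pts, axis=1, keepdims=True)
    patch = candidate_contact_patch(pts, proxy_center=[0.0, 0.0, 2.0],
                                    proxy_radius=0.05)
    print(f"{len(patch)} candidate contact points out of {len(pts)}")
```

In an actual haptic loop, a filter of this kind would be re-evaluated as the proxy moves (the paper couples this with a virtual coupling model), and the selected patch would then be locally reconstructed for force computation.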
