Real-time 3D virtual dressing based on users’ skeletons

Note: We are unable to review this paper.

PubDate: January 2018

Teams: Beijing Language and Culture University

Writers: Ting Liu; LingZhi Li; XiWen Zhang

PDF: Real-time 3D virtual dressing based on users’ skeletons

Abstract

3D virtual dressing lets users try on clothing models without physically changing clothes, and it offers a 3D fitting experience superior to 2D virtual dressing. It involves three components in a virtual 3D space: models of users, models of clothes, and the fitting between them. This paper presents real-time 3D virtual dressing based on users' skeletons. Users' skeletons are extracted and tracked in real time to drive the transformation and fitting of the clothing models. After the clothing models are rigged and weighted against the users' skeletons, the skinned models are transformed and fitted to the users with dynamic physical effects. They match the users' bodies and actions, producing synchronous transformation: scale, rotation, and deformation. A software prototype was developed with Unity 5.2 and Microsoft Kinect for Windows V2. The prototype allows users to select different clothing models and background images using motion-sensing gestures and to view various combinations after background removal. Tests with several users and garments produced satisfactory results.
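The paper does not include its implementation, but the core idea in the abstract, scaling a rigged clothing model so its bones match the tracked lengths of the user's skeleton, can be sketched as follows. This is a minimal illustration under assumed data, not the authors' code: the joint names, rest-pose positions, and the simplified per-bone scaling of a linear-blend skin are all hypothetical stand-ins for what a Unity/Kinect pipeline would provide.

```python
import numpy as np

# Hypothetical rest-pose skeleton of the clothing model:
# joint name -> 3D position (assumed values for illustration).
rest_pose = {
    "spine_base": np.array([0.0, 0.0, 0.0]),
    "spine_top":  np.array([0.0, 0.5, 0.0]),
    "shoulder_l": np.array([-0.2, 0.5, 0.0]),
    "hand_l":     np.array([-0.6, 0.5, 0.0]),
}

# Bones as (parent joint, child joint) pairs.
bones = [("spine_base", "spine_top"), ("shoulder_l", "hand_l")]

def bone_scales(tracked_pose, rest_pose, bones):
    """Per-bone scale = tracked bone length / rest-pose bone length."""
    scales = {}
    for parent, child in bones:
        rest_len = np.linalg.norm(rest_pose[child] - rest_pose[parent])
        live_len = np.linalg.norm(tracked_pose[child] - tracked_pose[parent])
        scales[(parent, child)] = live_len / rest_len
    return scales

def skin_vertex(vertex, weights, scales, rest_pose):
    """Blend a clothing vertex over its weighted bones: each bone
    scales the vertex about that bone's parent joint (rotation omitted
    for brevity; a full skinning step would also apply bone rotations)."""
    out = np.zeros(3)
    for bone, w in weights.items():
        parent, _ = bone
        out += w * (rest_pose[parent] + scales[bone] * (vertex - rest_pose[parent]))
    return out

# Example: the tracked user has a longer left arm than the rest pose,
# so a sleeve vertex bound to that bone is stretched outward.
tracked = dict(rest_pose)
tracked["hand_l"] = np.array([-0.8, 0.5, 0.0])  # arm 0.6 long vs 0.4 at rest
scales = bone_scales(tracked, rest_pose, bones)
sleeve_tip = skin_vertex(np.array([-0.6, 0.5, 0.0]),
                         {("shoulder_l", "hand_l"): 1.0},
                         scales, rest_pose)
```

In the actual prototype this role is played by Unity's skinned-mesh machinery: once the clothing mesh is rigged and weighted to the Kinect-tracked skeleton, per-frame joint updates drive the scale, rotation, and deformation automatically.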
