Ukemochi: A Video See-through Food Overlay System for Eating Experience in the Metaverse
PubDate: April 2022
Teams: Nara Institute of Science and Technology; The University of Tokyo
Writers: Kizashi Nakano; Daichi Horita; Naoya Isoyama; Hideaki Uchiyama; Kiyoshi Kiyokawa
The widespread use of Head-Mounted Displays (HMDs) allows ordinary users to interact with their friends daily in social Virtual Environments (VEs), or the metaverse. However, eating in the metaverse while wearing an HMD is difficult because the Real Environment (RE) is not visible. Currently, users either watch their food in the RE through the gap between their face and the HMD (the None condition) or superimpose a video see-through (VST) image on the VE, but both methods reduce the sense of presence. To allow natural eating in a VE, we propose Ukemochi, a system that improves both presence and ease of eating. Ukemochi seamlessly overlays a food segmentation image, inferred by deep neural networks, on the VE. Ukemochi runs alongside a VE created with the OpenVR API and can be easily deployed to the metaverse. In this study, we evaluated the effectiveness of Ukemochi by comparing three visual presentation methods (None, VST, and Ukemochi) and two meal conditions (Hand condition and Plate condition). The experimental results demonstrated that Ukemochi enables users to maintain a high sense of presence in the VE and improves the ease of eating. We believe that our study will provide users with the experience of eating in the metaverse and encourage further research on eating in the metaverse.
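The core compositing step described above, overlaying only the segmented food region of the camera image onto the virtual scene, can be sketched as a simple mask-based alpha blend. This is an illustrative sketch, not the paper's implementation: the function name `overlay_food` is hypothetical, and the segmentation mask is assumed to have already been produced by a neural network.

```python
import numpy as np

def overlay_food(ve_frame, vst_frame, food_mask):
    """Composite the food region of a video see-through (VST) camera
    frame onto a virtual-environment (VE) frame.

    ve_frame, vst_frame: HxWx3 float arrays in [0, 1]
    food_mask: HxW float array in [0, 1], where 1 marks food pixels
               (assumed to come from a segmentation network)
    """
    alpha = food_mask[..., None]  # broadcast the mask over RGB channels
    return alpha * vst_frame + (1.0 - alpha) * ve_frame

# Toy example: a 2x2 frame where only the top-left pixel is food.
ve = np.zeros((2, 2, 3))    # black virtual scene
vst = np.ones((2, 2, 3))    # white camera image
mask = np.array([[1.0, 0.0],
                 [0.0, 0.0]])
out = overlay_food(ve, vst, mask)
# Top-left pixel shows the camera image; the rest stays virtual.
```

A soft (non-binary) mask from the network would feather the food's edges, which is what makes the overlay appear seamless rather than a hard cut-out.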