URLLC-eMBB Slicing to Support VR Multimodal Perceptions over Wireless Cellular Systems
PubDate: May 2018
Teams: University of Oulu
Writers: Jihong Park, Mehdi Bennis
PDF: URLLC-eMBB Slicing to Support VR Multimodal Perceptions over Wireless Cellular Systems
Abstract
Virtual reality (VR) enables mobile wireless users to experience multimodal perceptions in a virtual space. In this paper, we investigate the problem of concurrently supporting visual and haptic perceptions over wireless cellular networks, with a focus on the downlink transmission phase. While visual perception requires moderate reliability and a maximized rate, haptic perception requires a fixed rate and high reliability. Hence, visuo-haptic VR traffic necessitates the use of two different network slices: enhanced mobile broadband (eMBB) for visual perception and ultra-reliable low-latency communication (URLLC) for haptic perception. We investigate two methods by which these two slices share the downlink resources, orthogonally and non-orthogonally, respectively. We compare these methods in terms of the just-noticeable difference (JND), an established measure in psychophysics, and show that non-orthogonal slicing becomes preferable under a higher target integrated-perceptual resolution and/or a higher target rate for the haptic perception.
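As a rough illustration of the orthogonal versus non-orthogonal resource-sharing trade-off described in the abstract, the sketch below compares the two schemes using simple Shannon-rate approximations. It is not the paper's model: the bandwidth, SNR, and haptic rate-target values, the power-domain superposition with successive interference cancellation, and the function names are all assumptions made for illustration only.

```python
# Illustrative sketch (not the paper's model): compare orthogonal vs
# non-orthogonal downlink slicing between an eMBB (visual) slice and a
# URLLC (haptic) slice, using Shannon-rate approximations.
import math

W = 20e6               # total downlink bandwidth in Hz (assumed)
SNR_VISUAL = 10.0      # linear SNR of the eMBB (visual) user (assumed)
SNR_HAPTIC = 10.0      # linear SNR of the URLLC (haptic) user (assumed)
R_HAPTIC_TARGET = 1e6  # fixed haptic rate requirement in bit/s (assumed)

def orthogonal_slicing(alpha):
    """Split the bandwidth: fraction alpha to URLLC, the rest to eMBB."""
    r_haptic = alpha * W * math.log2(1 + SNR_HAPTIC)
    r_visual = (1 - alpha) * W * math.log2(1 + SNR_VISUAL)
    return r_visual, r_haptic

def non_orthogonal_slicing(beta):
    """Superpose both slices on the full band: URLLC gets power share beta
    and is decoded while treating the eMBB signal as interference; the
    eMBB receiver is assumed to cancel the URLLC signal first."""
    r_haptic = W * math.log2(1 + beta * SNR_HAPTIC / (1 + (1 - beta) * SNR_HAPTIC))
    r_visual = W * math.log2(1 + (1 - beta) * SNR_VISUAL)
    return r_visual, r_haptic

def best_visual_rate(scheme):
    """Find the smallest resource/power share meeting the haptic target,
    and report the visual rate that remains under that share."""
    for share in (s / 1000 for s in range(1, 1000)):
        r_visual, r_haptic = scheme(share)
        if r_haptic >= R_HAPTIC_TARGET:
            return share, r_visual
    return None, 0.0

for name, scheme in (("orthogonal", orthogonal_slicing),
                     ("non-orthogonal", non_orthogonal_slicing)):
    share, r_visual = best_visual_rate(scheme)
    print(f"{name}: share {share:.3f}, visual rate {r_visual / 1e6:.1f} Mbit/s")
```

Under these assumed parameters the comparison only shows how each scheme trades haptic reliability/rate guarantees against leftover visual throughput; the paper's actual comparison is carried out in terms of the JND perceptual measure rather than raw rates.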