Multi-Connectivity and Edge Computing for Ultra-Low-Latency Lifelike Virtual Reality
PubDate: June 2020
Teams: New Jersey Institute of Technology; Southern Methodist University
Authors: Jacob Chakareski; Sabyasachi Gupta
We explore a novel multi-user mobile VR system for streaming scalable 8K 360° video at high reliability and immersion fidelity, and low interactive latency, via a synergistic integration of scalable 360° tiling, dual-band millimeter-wave (mmWave) and Wi-Fi transmission, and edge computing. High-rate directional mmWave links deliver viewport-specific high-quality enhancement layers of the 360° content to the individual users, while the base layer of the entire 360° panorama is broadcast to all users over Wi-Fi to augment the system's reliability. The viewport-specific enhancement layers can comprise both compressed and raw 360° tiles, the latter decoded first at the edge server. We explore the joint optimization of the mmWave access-point-to-user association, the choice of 360° tiles to be transmitted decompressed, the allocation of mmWave data rate across the compressed tiles in a viewport-specific enhancement layer, and the allocation of computing resources at the edge server and the user devices. Our objective is to maximize the minimum delivered VR immersion fidelity across all users, subject to transmission, latency, and computing constraints. We demonstrate that our framework enables a significant improvement in immersion fidelity (8 dB to 10 dB) and spatial resolution (8K vs. 4K) over MPEG-DASH using Wi-Fi transmission only. We also show that an increasing number of raw 360° tiles are sent as the mmWave link rate or the edge-server/user computing power increases, rigorously exploring the fundamental interplay between computing and communication capabilities, end-to-end system latency, and delivered VR immersion fidelity.
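To make the max-min objective concrete, the following is a minimal toy sketch of greedily splitting a shared mmWave rate budget so that the worst user's fidelity is maximized. The fidelity model `f(r) = log2(1 + r/r0)`, the function name, and all parameter values are hypothetical illustrations, not the paper's actual formulation or algorithm (which jointly optimizes association, tile selection, and compute allocation).

```python
# Hypothetical sketch of a max-min fidelity rate split across VR users.
# The log-based fidelity model and all parameters are illustrative only,
# not the formulation used in the paper.
import math

def maxmin_rate_allocation(num_users, total_rate, step=1.0, r0=10.0):
    """Greedily hand out the mmWave rate budget in small increments,
    each time to the user whose current (toy) fidelity is lowest."""
    rates = [0.0] * num_users

    def fidelity(r):
        # Toy diminishing-returns fidelity model (assumed, not the paper's).
        return math.log2(1.0 + r / r0)

    remaining = total_rate
    while remaining >= step:
        worst = min(range(num_users), key=lambda u: fidelity(rates[u]))
        rates[worst] += step
        remaining -= step
    return rates

# With identical users, max-min fairness reduces to an equal split.
alloc = maxmin_rate_allocation(num_users=3, total_rate=300.0)
print(alloc)  # → [100.0, 100.0, 100.0]
```

In the actual system the increments would instead go to per-tile layer rates, and the per-user fidelity would depend on the viewport tiles received, whether they arrive compressed or raw, and the decoding compute available at the edge and the device.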