A Low-cost Approach Towards Streaming 3D Videos of Large-scale Sport Events to Mixed Reality Headsets in Real-time
PubDate: May 2020
Teams: Auto-ID Labs MIT & ETHZ
Writers: Kevin Marty; Prithvi Rajasekaran; Yongbin Sun; Klaus Fuchs
Watching sports events via 3D instead of 2D video streaming allows for increased immersion, e.g. via mixed reality headsets in comparison to traditional screens. So far, capturing 3D video of sports events has required expensive outside-in tracking with numerous cameras. This study demonstrates the feasibility of streaming sports content to mixed reality headsets as holograms in real-time using only inside-out tracking and low-cost equipment. We demonstrate our system by streaming a race car on an indoor track as a 3D model, which is then rendered in a Magic Leap One headset. An onboard camera mounted on the race car provides the video stream used to localize the car via computer vision. The localization is estimated by an end-to-end convolutional neural network (CNN). The study compares three state-of-the-art CNN models with respect to accuracy and execution time, with PoseNet+LSTM achieving position and orientation errors of 0.35 m and 3.95°. The total streaming latency in this study was 1041 ms, suggesting technical feasibility of streaming 3D sports content, e.g. on large playgrounds, in near real-time onto mixed-reality headsets.
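The reported 0.35 m / 3.95° figures correspond to the standard evaluation metrics for PoseNet-style pose regressors: Euclidean distance for position and the angular difference between unit quaternions for orientation. A minimal sketch of how these two metrics are typically computed (function names and example values are ours, not taken from the paper):

```python
import math

def position_error(p_true, p_pred):
    """Euclidean distance (in metres) between ground-truth and predicted positions."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p_true, p_pred)))

def orientation_error_deg(q_true, q_pred):
    """Angular difference in degrees between two unit quaternions (w, x, y, z).

    The absolute value of the dot product handles the double-cover property
    (q and -q represent the same rotation).
    """
    dot = abs(sum(a * b for a, b in zip(q_true, q_pred)))
    dot = min(1.0, dot)  # guard against floating-point overshoot in acos
    return math.degrees(2.0 * math.acos(dot))

# Illustrative example: a 0.5 m position offset and a 90-degree rotation about z
p_err = position_error((0.0, 0.0, 0.0), (0.3, 0.0, 0.4))
q_err = orientation_error_deg(
    (1.0, 0.0, 0.0, 0.0),                                   # identity rotation
    (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4)))  # 90 deg about z
```

Averaging these two quantities over a test trajectory yields accuracy figures of the kind reported above.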