Context-Aware Mixed Reality: A Framework for Ubiquitous Interaction

Note: We do not have the ability to review this paper.


PubDate: Mar 2018

Teams: Bournemouth University; University of Chester; School of Informatics at the University of Bradford

Writers: Long Chen, Wen Tang, Nigel John, Tao Ruan Wan, Jian Jun Zhang

PDF: Context-Aware Mixed Reality: A Framework for Ubiquitous Interaction

Abstract

Mixed Reality (MR) is a powerful interactive technology that yields new types of user experience. We present a semantic-based interactive MR framework that goes beyond current geometry-level approaches, a step change in generating high-level context-aware interactions. Our key insight is that building semantic understanding in MR can not only greatly enhance the user experience through object-specific behaviours, but also pave the way for solving complex interaction design challenges. The framework generates semantic properties of the real-world environment through dense scene reconstruction and deep image understanding. We demonstrate our approach with a material-aware prototype system for generating context-aware physical interactions between real and virtual objects. Quantitative and qualitative evaluations are carried out, and the results show that the framework delivers accurate and fast semantic information in an interactive MR environment, providing effective semantic-level interactions.
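To illustrate the material-aware idea described in the abstract, here is a minimal sketch (not the authors' code): material labels predicted by the image-understanding stage for reconstructed real surfaces are mapped to physical parameters, which then drive how a virtual object responds when it collides with those surfaces. All names in the sketch (MaterialProperties, MATERIAL_TABLE, respond_to_collision) are hypothetical.

```python
# Hypothetical sketch of context-aware physical interaction:
# semantic material labels -> physics parameters -> collision response.

from dataclasses import dataclass


@dataclass
class MaterialProperties:
    friction: float      # Coulomb friction coefficient
    restitution: float   # bounciness: 0 = no bounce, 1 = perfectly elastic


# Assumed lookup from predicted material classes to physics parameters.
MATERIAL_TABLE = {
    "carpet": MaterialProperties(friction=0.9, restitution=0.1),
    "wood":   MaterialProperties(friction=0.5, restitution=0.5),
    "metal":  MaterialProperties(friction=0.3, restitution=0.8),
}
DEFAULT_MATERIAL = MaterialProperties(friction=0.5, restitution=0.4)


def respond_to_collision(material_label: str, incoming_speed: float) -> float:
    """Rebound speed of a virtual ball hitting a real surface whose
    material was predicted by the semantic segmentation stage."""
    props = MATERIAL_TABLE.get(material_label, DEFAULT_MATERIAL)
    return incoming_speed * props.restitution


if __name__ == "__main__":
    # The same virtual ball bounces differently on differently labelled
    # real surfaces, which is the object-specific behaviour the paper targets.
    for surface in ("carpet", "metal"):
        print(surface, respond_to_collision(surface, incoming_speed=2.0))
```

In the actual framework the labels would come from dense scene reconstruction plus deep image understanding rather than a fixed table; the sketch only shows how semantic information can feed object-specific physical behaviour.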
