MirageTable: Freehand Interaction on a Projected Augmented Reality Tabletop
Team: Microsoft
Authors: Hrvoje Benko, Andrew D. Wilson, Ricardo Jota
Publication date: May 2012
Abstract
Instrumented with a single depth camera, a stereoscopic projector, and a curved screen, MirageTable is an interactive system designed to merge real and virtual worlds into a single spatially registered experience on top of a table. Our depth camera tracks the user’s eyes and performs a real-time capture of both the shape and the appearance of any object placed in front of the camera (including the user’s body and hands). This real-time capture enables perspective stereoscopic 3D visualizations for a single user that account for deformations caused by physical objects on the table. In addition, the user can interact with virtual objects through physically realistic freehand actions without any gloves, trackers, or instruments. We illustrate these unique capabilities through three application examples: virtual 3D model creation, interactive gaming with real and virtual objects, and a 3D teleconferencing experience that not only presents a 3D view of a remote person, but also a seamless 3D shared task space. We also evaluated users’ perception of projected 3D objects in our system, which confirmed that users can correctly perceive such objects even when they are projected over different background colors and geometries (e.g., gaps, drops).
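The eye-tracked, perspective-correct stereoscopic rendering described in the abstract is commonly implemented with an off-axis (generalized) perspective projection computed from the tracked eye position relative to a display surface. The sketch below is an illustrative outline only, not code from the paper; the planar-screen assumption, the corner parameters, and the NumPy-based implementation are all assumptions introduced here.

```python
import numpy as np

def off_axis_projection(eye, pa, pb, pc, near, far):
    """Off-axis perspective projection for a tracked eye position.

    eye        -- 3D eye position reported by the head/eye tracker
    pa, pb, pc -- screen corners: lower-left, lower-right, upper-left
    near, far  -- clip plane distances
    Returns a 4x4 projection matrix (column-vector convention).
    """
    # Orthonormal basis of the screen plane.
    vr = pb - pa; vr /= np.linalg.norm(vr)            # screen right
    vu = pc - pa; vu /= np.linalg.norm(vu)            # screen up
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal (toward eye)

    # Vectors from the eye to the screen corners.
    va, vb, vc = pa - eye, pb - eye, pc - eye
    d = -np.dot(va, vn)                               # eye-to-screen distance

    # Frustum extents projected onto the near plane.
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard asymmetric frustum matrix.
    P = np.array([
        [2 * near / (r - l), 0.0, (r + l) / (r - l), 0.0],
        [0.0, 2 * near / (t - b), (t + b) / (t - b), 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0],
    ])

    # Rotate the screen basis into the view axes and translate the eye to the origin.
    M = np.eye(4)
    M[:3, 0], M[:3, 1], M[:3, 2] = vr, vu, vn
    T = np.eye(4)
    T[:3, 3] = -eye
    return P @ M.T @ T
```

In a stereoscopic setup of this kind, the function would be evaluated twice per frame, once for each eye position offset by half the interocular distance from the tracked head position, yielding the left and right images of the stereo pair.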