A 3D Mixed Reality Interface for Human-Robot Teaming

PubDate: Oct 2023

Teams: ETH Zurich; Microsoft Mixed Reality

Authors: Jiaqi Chen, Boyang Sun, Marc Pollefeys, Hermann Blum

PDF: A 3D Mixed Reality Interface for Human-Robot Teaming


This paper presents a mixed-reality human-robot teaming system. It allows human operators to see in real-time where robots are located, even if they are not in line of sight. The operator can also visualize the map that the robots create of their environment and can easily send robots to new goal positions. The system mainly consists of a mapping and a control module. The mapping module is a real-time multi-agent visual SLAM system that co-localizes all robots and mixed-reality devices to a common reference frame. Visualizations in the mixed-reality device then allow operators to see a virtual life-sized representation of the cumulative 3D map overlaid onto the real environment. As such, the operator can effectively “see through” walls into other rooms. To control robots and send them to new locations, we propose a drag-and-drop interface. An operator can grab any robot hologram in a 3D mini map and drag it to a new desired goal pose. We validate the proposed system through a user study and real-world deployments. We make the mixed-reality application publicly available at this https URL.
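Because the mapping module co-localizes every robot and mixed-reality device in a common reference frame, the drag-and-drop control described above reduces to a frame transformation: a goal pose placed in the shared mini-map frame can be re-expressed in any robot's local frame before being sent as a navigation goal. A minimal 2D (SE(2)) sketch of that transformation, with hypothetical helper names not taken from the paper:

```python
import math

# A pose is (x, y, theta) in some frame.

def se2_inv(p):
    """Inverse of an SE(2) pose."""
    x, y, th = p
    c, s = math.cos(th), math.sin(th)
    return (-c * x - s * y, s * x - c * y, -th)

def se2_mul(a, b):
    """Compose two SE(2) poses: a then b."""
    ax, ay, ath = a
    bx, by, bth = b
    c, s = math.cos(ath), math.sin(ath)
    return (ax + c * bx - s * by, ay + s * bx + c * by, ath + bth)

def goal_in_robot_frame(robot_pose_world, goal_pose_world):
    """Express a goal given in the common map frame in the robot's local frame.

    T_rg = inv(T_wr) * T_wg
    """
    return se2_mul(se2_inv(robot_pose_world), goal_pose_world)

# Example: a robot at (1, 0) facing +y, with a dragged goal at (1, 2)
# with the same heading, sees the goal 2 m straight ahead of it.
goal = goal_in_robot_frame((1.0, 0.0, math.pi / 2),
                           (1.0, 2.0, math.pi / 2))
```

The same idea extends to SE(3) in the real system; a 2D sketch is used here only to keep the frame algebra readable.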
