Translating Universal Scene Descriptions into Knowledge Graphs for Robotic Environment
PubDate: Oct 2023
Teams: University of Bremen
Writers: Giang Hoang Nguyen, Daniel Bessler, Simon Stelter, Mihai Pomarlan, Michael Beetz
PDF: Translating Universal Scene Descriptions into Knowledge Graphs for Robotic Environment
Abstract
Robots performing human-scale manipulation tasks require an extensive amount of knowledge about their surroundings in order to perform their actions competently and in a human-like manner. In this work, we investigate the use of virtual reality technology as an implementation for robot environment modeling, and present a technique for translating scene graphs into knowledge bases. To this end, we take advantage of the Universal Scene Description (USD) format, which is an emerging standard for the authoring, visualization, and simulation of complex environments. We investigate the conversion of USD-based environment models into Knowledge Graph (KG) representations that facilitate semantic querying and integration with additional knowledge sources.
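
The abstract does not spell out how the USD-to-KG translation is performed, but the core idea can be illustrated with a minimal sketch: traverse a USD stage with the OpenUSD Python bindings (pxr) and emit RDF triples with rdflib. This is not the authors' implementation; the namespace URI, the `usd_stage_to_kg` function, the `hasChild` relation, the position properties, and the `kitchen.usda` file name are all illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's code): convert a USD scene graph
# into an RDF knowledge graph using pxr (OpenUSD) and rdflib.
from pxr import Usd, UsdGeom
from rdflib import Graph, Namespace, Literal, RDF, URIRef

ENV = Namespace("http://example.org/robot-environment#")  # hypothetical namespace


def usd_stage_to_kg(usd_path: str) -> Graph:
    stage = Usd.Stage.Open(usd_path)
    kg = Graph()
    kg.bind("env", ENV)

    def prim_uri(prim):
        # Derive a stable URI for a prim from its scene-graph path.
        return URIRef(ENV + prim.GetPath().pathString.strip("/").replace("/", "_"))

    for prim in stage.Traverse():
        subject = prim_uri(prim)

        # Record the prim's USD schema type (e.g. Xform, Mesh) as an rdf:type assertion.
        type_name = str(prim.GetTypeName()) or "Prim"
        kg.add((subject, RDF.type, ENV[type_name]))

        # Preserve the scene-graph hierarchy as parent/child relations.
        parent = prim.GetParent()
        if parent and not parent.IsPseudoRoot():
            kg.add((prim_uri(parent), ENV.hasChild, subject))

        # Attach the world-space position of transformable prims as literals.
        if prim.IsA(UsdGeom.Xformable):
            xform = UsdGeom.Xformable(prim)
            translation = xform.ComputeLocalToWorldTransform(
                Usd.TimeCode.Default()
            ).ExtractTranslation()
            kg.add((subject, ENV.positionX, Literal(float(translation[0]))))
            kg.add((subject, ENV.positionY, Literal(float(translation[1]))))
            kg.add((subject, ENV.positionZ, Literal(float(translation[2]))))

    return kg


if __name__ == "__main__":
    graph = usd_stage_to_kg("kitchen.usda")  # hypothetical scene file
    print(graph.serialize(format="turtle"))
```

Once the scene is in RDF form, the semantic querying the abstract mentions could be done with standard SPARQL over the resulting graph, or by aligning the generated classes and relations with an external ontology as an additional knowledge source.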