Real Time Egocentric Object Segmentation for Mixed Reality: THU-READ Labeling and Benchmarking Results

Note: We are not able to review this paper.

PubDate: April 2022

Teams: Nokia Bell Labs

Writers: E. Gonzalez-Sosa; G. Robledo; D. Gonzalez-Morin; P. Perez-Garcia; A. Villegas

PDF: Real Time Egocentric Object Segmentation for Mixed Reality: THU-READ Labeling and Benchmarking Results

Abstract

Egocentric segmentation has attracted recent interest in the computer vision community due to its potential in Mixed Reality (MR) applications. While most previous works have focused on segmenting egocentric human body parts (mainly hands), little attention has been given to egocentric objects. Due to the lack of datasets with pixel-wise annotations of egocentric objects, in this paper we contribute a semantic-wise labeling of a subset of 2124 images from the RGB-D THU-READ Dataset. We also report benchmarking results using Thundernet, a real-time semantic segmentation network, which could allow future integration with end-to-end MR applications. A comparison with depth-based segmentation and a useful discussion regarding the suitability of the different algorithms for MR are also given.
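As a rough illustration of how a real-time segmentation network like the one benchmarked here could be dropped into an MR pipeline, the sketch below runs a lightweight off-the-shelf model (torchvision's LR-ASPP / MobileNetV3, used only as a stand-in for Thundernet) on a single egocentric RGB frame. The class count, input resolution, and model choice are assumptions for illustration, not details from the paper.

```python
# Minimal sketch: per-frame semantic segmentation of an egocentric RGB image.
# LR-ASPP (MobileNetV3 backbone) stands in for Thundernet; NUM_CLASSES and the
# 480x640 frame size are hypothetical, not values from the THU-READ labeling.
import torch
from torchvision.models.segmentation import lraspp_mobilenet_v3_large

NUM_CLASSES = 8  # hypothetical: background + a few egocentric object classes

model = lraspp_mobilenet_v3_large(weights=None, num_classes=NUM_CLASSES).eval()

# Dummy frame standing in for one RGB image from the labeled subset.
frame = torch.rand(1, 3, 480, 640)

with torch.no_grad():
    logits = model(frame)["out"]   # shape: (1, NUM_CLASSES, 480, 640)
    mask = logits.argmax(dim=1)    # per-pixel class labels, (1, 480, 640)

print(mask.shape)
```

In a real MR integration, the predicted mask would be computed per frame and used to composite the segmented real objects into the virtual scene, so inference latency (the motivation for a real-time network) directly bounds the achievable frame rate.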
