Continual Semantic Segmentation via Repulsion-Attraction of Sparse and Disentangled Latent Representations

Note: We do not have the ability to review papers.

PubDate: Mar 2021

Teams: University of Padova

Writers: Umberto Michieli, Pietro Zanuttigh

PDF: Continual Semantic Segmentation via Repulsion-Attraction of Sparse and Disentangled Latent Representations

Abstract

Deep neural networks suffer from the major limitation of catastrophically forgetting old tasks when learning new ones. In this paper we focus on class-incremental continual learning in semantic segmentation, where new categories are made available over time while previous training data is not retained. The proposed continual learning scheme shapes the latent space to reduce forgetting while improving the recognition of novel classes. Our framework is driven by three novel components, which can also be combined effortlessly with existing techniques. First, prototype matching enforces latent space consistency on old classes, constraining the encoder to produce similar latent representations for previously seen classes in subsequent steps. Second, feature sparsification makes room in the latent space to accommodate novel classes. Finally, contrastive learning is employed to cluster features according to their semantics while pulling apart those of different classes. Extensive evaluation on the Pascal VOC2012 and ADE20K datasets demonstrates the effectiveness of our approach, significantly outperforming state-of-the-art methods.
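The abstract describes three latent-space regularizers: prototype matching for old classes, feature sparsification, and a contrastive attraction/repulsion term. Below is a minimal PyTorch-style sketch of how such loss terms could be combined, assuming per-pixel encoder features flattened to vectors; the function names, margin, and loss weights are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch only: names, weights, and the margin are assumptions,
# not the implementation from the paper.
import torch
import torch.nn.functional as F


def prototype_matching_loss(features, labels, old_prototypes):
    """Pull features of previously seen classes toward their stored prototypes."""
    loss = features.new_zeros(())
    count = 0
    for cls, proto in old_prototypes.items():
        mask = labels == cls
        if mask.any():
            loss = loss + F.mse_loss(features[mask].mean(dim=0), proto)
            count += 1
    return loss / max(count, 1)


def sparsification_loss(features):
    """Encourage sparse activations so unused latent dimensions stay free for new classes."""
    return features.abs().mean()


def repulsion_attraction_loss(features, labels, margin=1.0):
    """Attract same-class features, repel different-class features (contrastive term)."""
    dist = torch.cdist(features, features)                       # pairwise distances (N, N)
    same = (labels.unsqueeze(0) == labels.unsqueeze(1)).float()  # 1 where labels match
    attract = (same * dist).sum() / same.sum().clamp(min=1)
    repel = ((1 - same) * F.relu(margin - dist)).sum() / (1 - same).sum().clamp(min=1)
    return attract + repel


# Toy usage: random per-pixel features flattened to (N, D) with their class labels,
# plus prototypes hypothetically stored from the previous incremental step.
feats = torch.randn(32, 16, requires_grad=True)
labels = torch.randint(0, 4, (32,))
old_protos = {0: torch.randn(16), 1: torch.randn(16)}

loss = (prototype_matching_loss(feats, labels, old_protos)
        + 0.1 * sparsification_loss(feats)
        + 0.1 * repulsion_attraction_loss(feats, labels))
loss.backward()
```

In a full pipeline these terms would be added to the usual segmentation cross-entropy at each incremental step; the 0.1 weights above are placeholders, not values reported in the paper.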
