We address the problem of few-shot semantic segmentation (FSS), which aims to segment novel class objects in a target image given only a few annotated samples. Although recent advances have been made by incorporating prototype-based metric learning, existing methods still show limited performance under extreme intra-class object variations and between semantically similar inter-class objects, owing to their poor feature representation. To tackle this problem, we propose a dual prototypical contrastive learning approach tailored to the FSS task that captures representative semantic features effectively. The main idea is to make the prototypes more discriminative by increasing the inter-class distance while reducing the intra-class distance in the prototype feature space. To this end, we first present a class-specific contrastive loss with a dynamic prototype dictionary that stores class-aware prototypes during training, encouraging prototypes of the same class to be similar and prototypes of different classes to be dissimilar. Furthermore, we introduce a class-agnostic contrastive loss that enhances generalization to unseen classes by compressing the feature distribution of each semantic class within an episode. We demonstrate that the proposed dual prototypical contrastive learning approach outperforms state-of-the-art FSS methods on the PASCAL-5i and COCO-20i datasets. The code is available at: https://github.com/kwonjunn01/DPCL1.
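To make the class-specific contrastive loss concrete, the sketch below shows one plausible form of an InfoNCE-style objective computed between batch prototypes and a prototype dictionary: entries of the same class act as positives, entries of other classes as negatives. The function name, tensor shapes, and temperature value are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def prototype_contrastive_loss(prototypes, labels, dict_protos, dict_labels,
                               temperature=0.1):
    """Illustrative class-specific contrastive loss over prototypes.

    prototypes:  (B, D) prototype features from the current episode.
    labels:      (B,)   class index of each batch prototype.
    dict_protos: (K, D) prototypes stored in the dynamic dictionary.
    dict_labels: (K,)   class index of each stored prototype.
    """
    # L2-normalize so the dot product is a cosine similarity.
    p = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    d = dict_protos / np.linalg.norm(dict_protos, axis=1, keepdims=True)
    logits = (p @ d.T) / temperature                      # (B, K) similarities

    # Positive mask: dictionary entries sharing the prototype's class.
    pos = (labels[:, None] == dict_labels[None, :]).astype(float)

    # Log-softmax over all dictionary entries (positives and negatives).
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))

    # Average negative log-likelihood over the positives of each prototype;
    # this pulls same-class prototypes together and pushes other classes apart.
    per_proto = -(pos * log_prob).sum(axis=1) / np.maximum(pos.sum(axis=1), 1)
    return per_proto.mean()
```

In a full training loop, the dictionary would be updated each iteration with the newly computed class prototypes (e.g., as a FIFO queue per class), so the loss always contrasts against prototypes from recent episodes.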