In this work, we revisit the prior mask guidance proposed in "Prior Guided Feature Enrichment Network for Few-Shot Segmentation". The prior mask serves as an indicator that highlights the regions of interest of unseen categories, and it has proven effective in improving performance across the different frameworks of recent studies. However, the current method directly takes the maximum element-to-element correspondence between the query and support features as the probability of belonging to the target class, so broader contextual information is seldom exploited during prior mask generation. To address this issue, we first propose the Context-aware Prior Mask (CAPM), which leverages additional nearby semantic cues to better locate objects in query images. Second, since the maximum correlation value is vulnerable to noisy features, we go one step further by incorporating a lightweight Noise Suppression Module (NSM) to screen out spurious responses, yielding high-quality masks for providing the prior knowledge. Both contributions are experimentally shown to have substantial practical merit, and the new model, named PFENet++, significantly outperforms the baseline PFENet as well as all other competitors on three challenging benchmarks: PASCAL-5$^i$, COCO-20$^i$, and FSS-1000. The new state-of-the-art performance is achieved without compromising efficiency, demonstrating its potential as a new strong baseline in few-shot semantic segmentation. Our code will be available at https://github.com/dvlab-research/PFENet++.
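To make the baseline mechanism concrete, below is a minimal sketch of the max-correlation prior mask the abstract describes, assuming PyTorch feature maps; the function and variable names (`max_correlation_prior`, `query_feat`, `supp_feat`, `supp_mask`) are illustrative placeholders, not the authors' identifiers.

```python
# Illustrative sketch (not the authors' implementation): for each query
# pixel, keep only its single strongest cosine similarity to the support
# foreground -- the element-to-element maximum criticized in the abstract.
import torch
import torch.nn.functional as F

def max_correlation_prior(query_feat, supp_feat, supp_mask, eps=1e-7):
    """query_feat, supp_feat: (B, C, H, W); supp_mask: (B, 1, H, W) binary."""
    B, C, H, W = query_feat.shape
    # Restrict support features to the annotated foreground region.
    supp_feat = supp_feat * supp_mask
    q = F.normalize(query_feat.view(B, C, -1), dim=1)  # (B, C, HW)
    s = F.normalize(supp_feat.view(B, C, -1), dim=1)   # (B, C, HW)
    # Cosine similarity between every query and support location.
    corr = torch.bmm(q.transpose(1, 2), s)             # (B, HW, HW)
    # Max over support locations: one value per query pixel, so nearby
    # context contributes nothing and one noisy feature can dominate.
    prior = corr.max(dim=2)[0].view(B, 1, H, W)
    # Min-max normalize per image so the prior mask lies in [0, 1].
    mn = prior.amin(dim=(2, 3), keepdim=True)
    mx = prior.amax(dim=(2, 3), keepdim=True)
    return (prior - mn) / (mx - mn + eps)
```

This single-maximum reduction is precisely what CAPM (adding nearby semantic cues) and NSM (suppressing noisy responses) are proposed to improve upon.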