Seed area generation is usually the starting point of weakly supervised semantic segmentation (WSSS). Computing a Class Activation Map (CAM) from a multi-label classification network is the de facto paradigm for seed area generation, but CAMs generated from Convolutional Neural Networks (CNNs) and Transformers are prone to under- and over-activation, respectively, so strategies that refine CAMs for CNNs are usually inappropriate for Transformers, and vice versa. In this paper, we propose a Unified optimization paradigm for Seed Area GEneration (USAGE) that serves both types of networks, in which the objective function consists of two terms: one is a generation loss, which controls the shape of seed areas via a temperature parameter that follows a deterministic principle for each network type; the other is a regularization loss, which enforces consistency between seed areas generated from different views by self-adaptive network adjustment, so as to overturn false activations in seed areas. Experimental results show that USAGE consistently improves seed area generation for both CNNs and Transformers by large margins, e.g., outperforming state-of-the-art methods by 4.1% mIoU on PASCAL VOC. Moreover, based on USAGE-generated seed areas on Transformers, we achieve state-of-the-art WSSS results on both PASCAL VOC and MS COCO.
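To make the two-term objective concrete, the following is a minimal sketch of the ideas, not the paper's actual formulation: a temperature parameter that sharpens or smooths a CAM before thresholding it into a seed area (smaller temperatures shrink the activated region, larger ones expand it), and a simple consistency penalty between seed-area scores obtained from two views of the same image. All function names, the power-style temperature scaling, and the L1 consistency term are illustrative assumptions.

```python
import numpy as np

def temperature_seed_areas(cam, tau=0.5, threshold=0.3):
    """Illustrative assumption: scale a normalized CAM by a temperature tau,
    then threshold it into a binary seed area. A small tau (exponent > 1)
    sharpens the map and shrinks the seed area; a large tau (exponent < 1)
    smooths it and expands the seed area."""
    cam = cam / cam.max()         # normalize activations to [0, 1]
    scaled = cam ** (1.0 / tau)   # temperature-style sharpening/smoothing
    return scaled > threshold     # binary seed area mask

def consistency_loss(cam_view_a, cam_view_b):
    """Illustrative stand-in for the regularization term: mean absolute
    difference between CAMs generated from two views of the same image,
    penalizing inconsistent (likely false) activations."""
    return float(np.mean(np.abs(cam_view_a - cam_view_b)))

# Demo: on the same random CAM, a low temperature yields a smaller
# seed area than a high temperature.
rng = np.random.default_rng(0)
cam = rng.random((8, 8))
sharp = temperature_seed_areas(cam, tau=0.25)  # smaller activated region
smooth = temperature_seed_areas(cam, tau=2.0)  # larger activated region
print(sharp.sum() <= smooth.sum())  # True: sharpening never grows the area
```

Since the normalized CAM values lie in [0, 1], raising them to a larger exponent can only decrease them, so every pixel kept at the low temperature is also kept at the high temperature; this monotone shrink/grow behavior is what makes a single temperature knob a plausible way to counteract under-activation (CNNs) or over-activation (Transformers).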