Pseudo-labeling has emerged as a simple yet effective technique for semi-supervised object detection (SSOD). However, the inevitable noise in pseudo-labels significantly degrades the performance of SSOD methods. Recent advances effectively alleviate the classification noise in SSOD, while the localization noise, which is a non-negligible part of SSOD, is not well addressed. In this paper, we analyse the localization noise in both the generation and learning phases, and propose two strategies, namely pseudo-label correction and noise-unaware learning. For pseudo-label correction, we introduce a multi-round refining method and a multi-vote weighting method. The former iteratively refines the pseudo boxes to improve the stability of predictions, while the latter smoothly self-corrects pseudo boxes by weighting the scores of surrounding jittered boxes. For noise-unaware learning, we introduce a loss weight function that is negatively correlated with the Intersection over Union (IoU) in the regression task, which pulls the predicted boxes closer to the object and improves localization accuracy. Our proposed method, Pseudo-label Correction and Learning (PCL), is extensively evaluated on the MS COCO and PASCAL VOC benchmarks. On MS COCO, PCL outperforms the supervised baseline by 12.16, 12.11, and 9.57 mAP and the recent SOTA (SoftTeacher) by 3.90, 2.54, and 2.43 mAP under 1\%, 5\%, and 10\% labeling ratios, respectively. On PASCAL VOC, PCL improves the supervised baseline by 5.64 mAP and the recent SOTA (Unbiased Teacher v2) by 1.04 mAP on AP$^{50}$.
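To make the noise-unaware learning idea concrete, the following is a minimal sketch of an IoU-dependent regression loss weight. The linear form $w = 1 - \mathrm{IoU}$ and the L1 regression loss are illustrative assumptions, not the paper's exact formulation; the only property taken from the abstract is that the weight is negatively correlated with the IoU between the predicted box and the pseudo box, so low-overlap predictions are pulled harder toward the object.

```python
def iou(a, b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0

def weighted_reg_loss(pred_box, pseudo_box):
    """Regression loss with an IoU-dependent weight.

    Hypothetical form w = 1 - IoU: the lower the overlap with the
    pseudo box, the larger the weight on the regression term.
    """
    w = 1.0 - iou(pred_box, pseudo_box)
    l1 = sum(abs(p - q) for p, q in zip(pred_box, pseudo_box))
    return w * l1
```

For example, a prediction that already matches its pseudo box receives zero weight, while a half-overlapping prediction keeps a substantial weight on its regression loss.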