Semi-supervised learning, i.e., training networks with both labeled and unlabeled data, has made significant progress recently. However, existing works have primarily focused on image classification tasks and neglected object detection, which requires more annotation effort. In this work, we revisit Semi-Supervised Object Detection (SS-OD) and identify the pseudo-labeling bias issue in SS-OD. To address this, we introduce Unbiased Teacher, a simple yet effective approach that jointly trains a student and a gradually progressing teacher in a mutually beneficial manner. Together with a class-balance loss to downweight overly confident pseudo-labels, Unbiased Teacher consistently improves state-of-the-art methods by significant margins on the COCO-standard, COCO-additional, and VOC benchmarks. Specifically, Unbiased Teacher achieves a 6.8 absolute mAP improvement over the state-of-the-art method when using 1% of labeled data on MS-COCO, and around 10 mAP improvements over the supervised baseline when using only 0.5%, 1%, or 2% of labeled data on MS-COCO.
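To make the two core ideas mentioned above concrete, the sketch below illustrates, under assumed PyTorch conventions, (1) a gradually progressing teacher maintained as an exponential moving average (EMA) of the student, and (2) confidence-based filtering of teacher detections into pseudo-labels for the student. The function names, the `keep_rate`, and the `threshold` value are hypothetical choices for illustration, not the paper's exact implementation.

```python
import torch

@torch.no_grad()
def ema_update(teacher, student, keep_rate=0.996):
    # Move each teacher parameter slowly toward the corresponding student
    # parameter; the teacher thus "gradually progresses" with the student.
    for t_param, s_param in zip(teacher.parameters(), student.parameters()):
        t_param.mul_(keep_rate).add_(s_param, alpha=1.0 - keep_rate)

def filter_pseudo_labels(boxes, scores, labels, threshold=0.7):
    # Keep only teacher detections whose confidence exceeds the (assumed)
    # threshold; these serve as pseudo-labels for the student's unlabeled loss.
    keep = scores > threshold
    return boxes[keep], scores[keep], labels[keep]
```

In such a scheme, the student is updated by gradient descent on labeled data plus pseudo-labeled unlabeled data, while the teacher is updated only through `ema_update`, which keeps its predictions stable enough to produce reliable pseudo-labels.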