We tackle the problem of domain adaptation in object detection, where there is a significant domain shift between a source domain (with supervision) and a target domain of interest (without supervision). As a widely adopted domain adaptation method, the self-training teacher-student framework (a student model learns from pseudo labels generated by a teacher model) has yielded remarkable accuracy gains on the target domain. However, it still suffers from a large number of low-quality pseudo labels (e.g., false positives) generated by the teacher due to its bias toward the source domain. To address this issue, we propose a self-training framework called Adaptive Unbiased Teacher (AUT) that leverages adversarial learning and weak-strong data augmentation during mutual learning to mitigate domain shift. Specifically, we employ feature-level adversarial training in the student model, ensuring that features extracted from the source and target domains share similar statistics. This enables the student model to capture domain-invariant features. Furthermore, we apply weak-strong augmentation and mutual learning between the teacher model on the target domain and the student model on both domains. This enables the teacher model to gradually benefit from the student model without suffering from domain shift. We show that AUT outperforms all existing approaches and even Oracle (fully supervised) models by a large margin. For example, we achieve 50.9% (49.3%) mAP on Foggy Cityscapes (Clipart1K), which is 9.2% (5.2%) and 8.2% (11.0%) higher than the previous state of the art and Oracle, respectively.
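To make the training procedure described above concrete, the following is a minimal, self-contained PyTorch sketch of the mutual-learning loop: an EMA teacher produces confident pseudo labels on weakly augmented target data, while the student learns from source labels, target pseudo labels, and a feature-level adversarial loss through a gradient-reversal layer. This is an illustrative toy, not the authors' implementation: `ToyDetector`, `weak_aug`, `strong_aug`, the confidence threshold, and the loss weights are all assumed stand-ins for a real detector and augmentation pipeline.

```python
# Toy sketch (assumed, not the authors' code) of the AUT-style training loop:
# EMA teacher + weak-strong augmentation + feature-level adversarial training.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass, reversed gradient in the backward pass."""
    @staticmethod
    def forward(ctx, x):
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_out):
        return -grad_out


class ToyDetector(nn.Module):
    """Stand-in for a detector: a feature extractor plus a class head."""
    def __init__(self, in_dim=32, feat_dim=64, num_classes=9):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        self.head = nn.Linear(feat_dim, num_classes)

    def forward(self, x):
        feat = self.backbone(x)
        return feat, self.head(feat)


def ema_update(teacher, student, alpha=0.999):
    """Teacher weights slowly track the student via exponential moving average."""
    with torch.no_grad():
        for t, s in zip(teacher.parameters(), student.parameters()):
            t.mul_(alpha).add_(s, alpha=1.0 - alpha)


def weak_aug(x):    # stand-in for weak augmentation (e.g., flip); identity here
    return x


def strong_aug(x):  # stand-in for strong augmentation (e.g., color jitter)
    return x + 0.1 * torch.randn_like(x)


student = ToyDetector()
teacher = copy.deepcopy(student)          # teacher starts as a copy of the student
for p in teacher.parameters():
    p.requires_grad_(False)               # teacher is never updated by gradients
discriminator = nn.Linear(64, 1)          # source-vs-target domain classifier
opt = torch.optim.SGD(
    list(student.parameters()) + list(discriminator.parameters()), lr=0.01)

for step in range(100):
    src_x, src_y = torch.randn(16, 32), torch.randint(0, 9, (16,))  # labeled source
    tgt_x = torch.randn(16, 32)                                     # unlabeled target

    # 1) Teacher generates pseudo labels on weakly augmented target data,
    #    keeping only confident predictions to suppress false positives.
    with torch.no_grad():
        _, t_logits = teacher(weak_aug(tgt_x))
        conf, pseudo_y = F.softmax(t_logits, dim=1).max(dim=1)
        keep = conf > 0.8

    # 2) Student losses: supervised on source + unsupervised on strongly
    #    augmented target data with the teacher's pseudo labels.
    src_feat, src_logits = student(strong_aug(src_x))
    tgt_feat, tgt_logits = student(strong_aug(tgt_x))
    sup_loss = F.cross_entropy(src_logits, src_y)
    unsup_loss = (F.cross_entropy(tgt_logits[keep], pseudo_y[keep])
                  if keep.any() else torch.zeros(()))

    # 3) Feature-level adversarial loss via gradient reversal: the discriminator
    #    separates domains while reversed gradients push the student's backbone
    #    toward domain-invariant features.
    dom_logits = discriminator(
        GradReverse.apply(torch.cat([src_feat, tgt_feat]))).squeeze(1)
    dom_labels = torch.cat([torch.zeros(16), torch.ones(16)])
    adv_loss = F.binary_cross_entropy_with_logits(dom_logits, dom_labels)

    opt.zero_grad()
    (sup_loss + unsup_loss + 0.1 * adv_loss).backward()
    opt.step()

    # 4) Teacher improves only through EMA of the student, gradually benefiting
    #    from the student without being exposed to noisy gradients.
    ema_update(teacher, student)
```

In this sketch the gradient-reversal layer lets a single backward pass train the discriminator to distinguish domains while pushing the student's features toward domain-invariant statistics, and the EMA update is the only channel through which the teacher changes, mirroring the mutual-learning scheme in the abstract.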