Domain adaptation (DA) aims to transfer knowledge from a label-rich and related domain (the source domain) to a label-scarce domain (the target domain). Pseudo-labeling has recently been widely explored and used in DA; however, this line of research remains limited by the inaccuracy of pseudo-labels. In this paper, we report an interesting observation: target samples belonging to classes with larger domain shift are more easily misclassified than samples of other classes. We call these classes hard classes; they deteriorate the performance of DA and restrict its applications. We propose a novel framework, called Hard Class Rectification Pseudo-labeling (HCRPL), to alleviate the hard-class problem from two aspects. First, since it is difficult to identify which target samples belong to hard classes, we propose a simple yet effective scheme, named Adaptive Prediction Calibration (APC), which calibrates the predictions of target samples according to the difficulty of each class. Second, we further observe that the predictions of target samples belonging to hard classes are vulnerable to perturbations. To prevent these samples from being misclassified easily, we introduce Temporal Ensembling (TE) and Self-Ensembling (SE) to obtain consistent predictions. The proposed method is evaluated in both unsupervised domain adaptation (UDA) and semi-supervised domain adaptation (SSDA). Experimental results on several real-world cross-domain benchmarks, including ImageCLEF, Office-31, and Office-Home, substantiate the superiority of the proposed method.
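The abstract does not give APC's exact formula. Below is a minimal NumPy sketch of one plausible reading, in which target predictions are rescaled toward hard classes by the ratio of a uniform prior to each class's estimated average predicted mass; the function name, the easiness estimate, and the exponent `alpha` are our assumptions, not the paper's definition.

import numpy as np

def adaptive_prediction_calibration(probs, alpha=1.0, eps=1e-8):
    """Sketch of an APC-style calibration: boost predictions for hard
    (under-predicted) classes, then renormalize. `probs` is an (N, C)
    array of softmax outputs on target samples."""
    # Estimate per-class "easiness" as the average probability mass the
    # model assigns to each class over the target set (assumption: hard
    # classes, having larger domain shift, receive little mass).
    class_mass = probs.mean(axis=0)                      # (C,)
    # Calibration weights: ratio of a uniform class prior to the
    # estimated class distribution, so hard classes get weights > 1.
    num_classes = probs.shape[1]
    weights = (1.0 / num_classes) / (class_mass + eps)   # (C,)
    calibrated = probs * weights[None, :] ** alpha
    # Renormalize each row so it is a valid distribution again.
    return calibrated / calibrated.sum(axis=1, keepdims=True)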
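Temporal Ensembling and Self-Ensembling are standard consistency techniques; the sketch below shows one way they could yield the consistent target predictions the abstract describes, assuming TE is an exponential moving average of predictions across epochs (momentum 0.6, a typical choice) and SE averages softmax outputs over stochastic augmentations. `model_fn` and `augmented_views` are hypothetical names for illustration.

import numpy as np

def temporal_ensemble(ema_probs, epoch_probs, momentum=0.6):
    """TE: exponential moving average of per-sample predictions across
    epochs, damping epoch-to-epoch fluctuations."""
    return momentum * ema_probs + (1.0 - momentum) * epoch_probs

def self_ensemble(model_fn, augmented_views):
    """SE: average the model's softmax outputs over several
    stochastically augmented views of the same target batch."""
    preds = [model_fn(view) for view in augmented_views]
    return np.stack(preds, axis=0).mean(axis=0)

Under this reading, pseudo-labels for target samples would be taken as the argmax of the calibrated, ensembled predictions, so that a hard-class sample whose raw prediction flickers across epochs or augmentations is less likely to receive an incorrect label.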