We address the Unsupervised Domain Adaptation (UDA) problem in image classification from a new perspective. In contrast to most existing works, which either align the data distributions or learn domain-invariant features, we directly learn a unified classifier for both domains within a high-dimensional homogeneous feature space, without explicit domain adaptation. To this end, we employ effective Selective Pseudo-Labelling (SPL) techniques to take advantage of the unlabelled samples in the target domain. Surprisingly, the data distribution discrepancy across the source and target domains can be handled well by a computationally simple classifier (e.g., a shallow Multi-Layer Perceptron) trained in the original feature space. In addition, we propose a novel generative model, norm-VAE, which generates synthetic features for the target domain as a data augmentation strategy to enhance classifier training. Experimental results on several benchmark datasets demonstrate that the pseudo-labelling strategy alone achieves performance comparable to many state-of-the-art methods, whilst the use of norm-VAE for feature augmentation further improves performance in most cases. As a result, our proposed methods (i.e., naive-SPL and norm-VAE-SPL) achieve new state-of-the-art performance with average accuracies of 93.4% and 90.4% on the Office-Caltech and ImageCLEF-DA datasets, and comparable performance on the Digits, Office31 and Office-Home datasets with average accuracies of 97.2%, 87.6% and 67.9%, respectively.
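To make the selective pseudo-labelling idea concrete, the following is a minimal sketch of an SPL loop with a shallow MLP trained directly on pre-extracted features, as described above. It is not the paper's exact algorithm: the variable names (Xs, ys, Xt), the MLP size, and the progressive selection schedule are illustrative assumptions.

```python
# Hedged sketch of selective pseudo-labelling (SPL) with a shallow MLP classifier.
# Xs, ys: labelled source features/labels; Xt: unlabelled target features (assumed inputs).
import numpy as np
from sklearn.neural_network import MLPClassifier

def selective_pseudo_label(Xs, ys, Xt, n_rounds=5):
    """Iteratively train a unified classifier on source + confidently pseudo-labelled target features."""
    clf = MLPClassifier(hidden_layer_sizes=(256,), max_iter=500)
    clf.fit(Xs, ys)                                   # initial model trained on source features only
    for r in range(1, n_rounds + 1):
        probs = clf.predict_proba(Xt)                 # soft predictions on target features
        pseudo = clf.classes_[probs.argmax(axis=1)]   # pseudo-labels for target samples
        conf = probs.max(axis=1)
        k = int(len(Xt) * r / n_rounds)               # progressively enlarge the selected subset (assumed schedule)
        sel = np.argsort(-conf)[:k]                   # keep only the most confident target samples
        X_aug = np.vstack([Xs, Xt[sel]])
        y_aug = np.concatenate([ys, pseudo[sel]])
        clf = MLPClassifier(hidden_layer_sizes=(256,), max_iter=500)
        clf.fit(X_aug, y_aug)                         # retrain the unified classifier on both domains
    return clf
```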