Standard domain adaptation methods perform poorly when there is a large gap between the source and target domains. Gradual domain adaptation is one approach to this problem: it leverages intermediate domains that shift gradually from the source domain to the target domain. Previous work assumed that the number of intermediate domains is large and the distance between adjacent domains is small, so that the gradual domain adaptation algorithm, based on self-training with unlabeled datasets, is applicable. In practice, however, gradual self-training fails when the number of available intermediate domains is limited and the distance between adjacent domains is large. We propose using normalizing flows to address this problem while remaining within the framework of unsupervised domain adaptation. Our method generates pseudo intermediate domains with normalizing flows and uses them for gradual domain adaptation. We evaluate the proposed method in experiments on real-world datasets and confirm that it mitigates the problem described above and improves classification performance.
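To make the mechanism concrete, the sketch below (not taken from the paper) shows the gradual self-training loop that the approach builds on, using scikit-learn's LogisticRegression as the classifier. The sequence of intermediate domains is simulated here by slowly rotating a toy two-Gaussian problem; in the proposed method these would instead be the pseudo intermediate domains sampled from normalizing flows. The function names `sample_domain` and `gradual_self_train` are illustrative, not from the paper.

```python
# Minimal sketch of gradual self-training on a chain of (pseudo) intermediate
# domains. The rotating toy distribution stands in for flow-generated domains.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def sample_domain(angle, n=300):
    """Two Gaussian classes on opposite sides of the origin, rotated by `angle`."""
    y = rng.integers(0, 2, size=n)
    centers = 2.0 * np.stack([np.cos(angle + np.pi * y),
                              np.sin(angle + np.pi * y)], axis=1)
    x = centers + 0.4 * rng.normal(size=(n, 2))
    return x, y

def gradual_self_train(x_src, y_src, unlabeled_domains):
    """Fit on the labeled source, then successively self-label each domain in order."""
    clf = LogisticRegression().fit(x_src, y_src)
    for x_mid in unlabeled_domains:
        pseudo = clf.predict(x_mid)              # pseudo-labels from the current model
        clf = LogisticRegression().fit(x_mid, pseudo)
    return clf

# Source at 0 degrees, target at 90 degrees, four unlabeled intermediate domains.
angles = np.linspace(0.0, np.pi / 2, 6)
x_src, y_src = sample_domain(angles[0])
x_tgt, y_tgt = sample_domain(angles[-1])
intermediates = [sample_domain(a)[0] for a in angles[1:-1]]

direct = LogisticRegression().fit(x_src, y_src)
gradual = gradual_self_train(x_src, y_src, intermediates + [x_tgt])

print("source-only accuracy on target:", direct.score(x_tgt, y_tgt))
print("gradual self-training accuracy:", gradual.score(x_tgt, y_tgt))
```

When the chain of intermediate domains is dense, each self-training step only has to bridge a small shift, which is why the gradual variant tracks the target while the source-only classifier does not; the paper's contribution is to supply such a chain synthetically, via normalizing flows, when real intermediate domains are few and far apart.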