Conventional domain adaptation methods do not work well when there is a large gap between the source and the target domain. Gradual domain adaptation addresses this problem by leveraging intermediate domains that shift gradually from the source to the target domain. Previous work assumed that the number of intermediate domains is large and the distance between adjacent domains is small, so that gradual domain adaptation by self-training with unlabeled datasets is applicable. In practice, however, gradual self-training fails because the number of intermediate domains is limited and the distance between adjacent domains is large. We propose using normalizing flows to mitigate this problem while maintaining the framework of unsupervised domain adaptation. We generate pseudo-intermediate domains with normalizing flows and then use them for gradual domain adaptation. We evaluate our method in experiments with real-world datasets and confirm that it mitigates the problem described above and improves classification performance.
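The abstract outlines the overall procedure: generate pseudo-intermediate domains between the source and the target, then adapt a classifier by gradual self-training through them. The following is a minimal sketch of that loop, not the paper's implementation: the flow-based generation step is replaced by a simple interpolation placeholder (the helper `pseudo_intermediate`, the toy data, and all parameters are illustrative assumptions; a trained normalizing flow would instead interpolate in its latent space and map back through the inverse flow).

```python
# Minimal sketch of gradual self-training over pseudo-intermediate domains.
# The flow-based generation step is stubbed out with linear interpolation
# between source and target samples; with a normalizing flow f, one would
# use f^{-1}((1 - t) * f(x_src) + t * f(x_tgt)) instead.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy source domain (labeled) and target domain (unlabeled, shifted).
n = 500
X_src = rng.normal(size=(n, 2))
y_src = (X_src[:, 0] + X_src[:, 1] > 0).astype(int)
X_tgt = rng.normal(size=(n, 2)) + np.array([4.0, 4.0])  # large source-target gap


def pseudo_intermediate(X_a, X_b, t):
    """Placeholder for flow-based generation: interpolate paired samples."""
    idx = rng.permutation(len(X_b))[: len(X_a)]
    return (1.0 - t) * X_a + t * X_b[idx]


# Train an initial classifier on the labeled source domain.
clf = LogisticRegression().fit(X_src, y_src)

# Gradual self-training: adapt through a sequence of pseudo-intermediate domains.
for t in np.linspace(0.2, 1.0, num=5):
    X_mid = pseudo_intermediate(X_src, X_tgt, t)
    pseudo_labels = clf.predict(X_mid)  # self-training uses the model's own predictions
    clf = LogisticRegression().fit(X_mid, pseudo_labels)

# The final classifier has been adapted step by step toward the target domain.
print("predicted target labels (first 10):", clf.predict(X_tgt)[:10])
```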