Semi-supervised domain adaptation (SSDA) aims to adapt models trained on a labeled source domain to a different but related target domain, from which unlabeled data and a small set of labeled data are provided. Current methods that treat source and target supervision without distinction overlook their inherent discrepancy, resulting in a source-dominated model that fails to make effective use of the target supervision. In this paper, we argue that the labeled target data needs to be distinguished for effective SSDA, and propose to explicitly decompose the SSDA task into two sub-tasks: a semi-supervised learning (SSL) task in the target domain and an unsupervised domain adaptation (UDA) task across domains. By doing so, the two sub-tasks can better leverage the corresponding supervision and thus yield very different classifiers. To integrate the strengths of the two classifiers, we apply the well-established co-training framework, in which the two classifiers exchange their high-confidence predictions to iteratively "teach each other" so that both classifiers can excel in the target domain. We call our approach Deep Co-training with Task decomposition (DeCoTa). DeCoTa requires no adversarial training and is easy to implement. Moreover, DeCoTa is well-founded on the theoretical condition under which co-training succeeds. As a result, DeCoTa achieves state-of-the-art results on several SSDA datasets, outperforming the prior art by a notable 4% margin on DomainNet. Code is available at https://github.com/LoyoYang/DeCoTa.
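To make the co-training mechanism concrete, below is a minimal sketch of the pseudo-label exchange step it describes, assuming two PyTorch classifiers `f_t` (the target-domain SSL model) and `f_st` (the cross-domain UDA model); the function name, the confidence threshold value, and the batch handling are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn.functional as F

def exchange_pseudo_labels(f_t, f_st, unlabeled_x, threshold=0.95):
    """Co-training style pseudo-label exchange (illustrative sketch).

    Each classifier labels the unlabeled target batch; only predictions
    above `threshold` are kept and handed to the *other* classifier as
    extra supervision, so the two models iteratively "teach each other".
    """
    with torch.no_grad():
        p_t = F.softmax(f_t(unlabeled_x), dim=1)    # target-only (SSL) classifier
        p_st = F.softmax(f_st(unlabeled_x), dim=1)  # cross-domain (UDA) classifier

    conf_t, label_t = p_t.max(dim=1)
    conf_st, label_st = p_st.max(dim=1)

    # High-confidence predictions of f_t supervise f_st, and vice versa.
    for_f_st = (unlabeled_x[conf_t >= threshold], label_t[conf_t >= threshold])
    for_f_t = (unlabeled_x[conf_st >= threshold], label_st[conf_st >= threshold])
    return for_f_t, for_f_st
```

In a training loop, the returned pseudo-labeled batches would simply be added to each classifier's supervised loss on the labeled source and target data, with no adversarial objective involved.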