Target domain pseudo-labelling has proven effective in unsupervised domain adaptation (UDA). However, pseudo-labels of unlabeled target domain data are inevitably noisy due to the distribution shift between source and target domains. This paper proposes a Generative model-based Noise-Robust Training method (GeNRT), which eliminates domain shift while mitigating label noise. GeNRT incorporates a Distribution-based Class-wise Feature Augmentation (D-CFA) and a Generative-Discriminative classifier Consistency (GDC), both based on the class-wise target distributions modelled by generative models. D-CFA minimizes the domain gap by augmenting the source data with distribution-sampled target features, and trains a noise-robust discriminative classifier by using target domain knowledge from the generative models. GDC regards all the class-wise generative models as generative classifiers and enforces a consistency regularization between the generative and discriminative classifiers. It exploits an ensemble of target knowledge from all the generative models to train a noise-robust discriminative classifier, and is theoretically connected to the Ben-David domain adaptation theorem for reducing the domain gap. Extensive experiments on Office-Home, PACS, and Digit-Five show that our GeNRT achieves comparable performance to state-of-the-art methods under single-source and multi-source UDA settings.
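To make the two components concrete, the following is a minimal sketch of the underlying ideas, not the paper's actual implementation: each target class is modelled as a diagonal Gaussian over features fitted from pseudo-labelled target data (the choice of Gaussian, the feature dimensions, and all variable names here are illustrative assumptions); D-CFA then augments source features with samples from the matching class-wise target distribution, and the generative classifier used by GDC scores features by per-class log-likelihood.

```python
import numpy as np

rng = np.random.default_rng(0)
num_classes, feat_dim = 3, 8

# Toy "target" features with pseudo-labels (assumed produced by a source model).
tgt_feats = rng.normal(size=(60, feat_dim))
tgt_pseudo = rng.integers(0, num_classes, size=60)

# Model each target class as a Gaussian (mean, diagonal std) -- a stand-in
# for the paper's class-wise generative models.
means = np.stack([tgt_feats[tgt_pseudo == c].mean(0) for c in range(num_classes)])
stds = np.stack([tgt_feats[tgt_pseudo == c].std(0) + 1e-6 for c in range(num_classes)])

def d_cfa_augment(src_feats, src_labels):
    """D-CFA idea, simplified: augment source features with samples drawn
    from the target Gaussian of the same class."""
    sampled = means[src_labels] + stds[src_labels] * rng.normal(size=src_feats.shape)
    return (np.concatenate([src_feats, sampled]),
            np.concatenate([src_labels, src_labels]))

def generative_classifier(feats):
    """GDC's generative classifier, simplified: class posteriors from
    Gaussian log-likelihoods, assuming equal class priors."""
    log_lik = np.stack([
        -0.5 * (((feats - means[c]) / stds[c]) ** 2
                + 2.0 * np.log(stds[c])).sum(1)
        for c in range(num_classes)], axis=1)
    log_lik -= log_lik.max(1, keepdims=True)          # numerical stability
    p = np.exp(log_lik)
    return p / p.sum(1, keepdims=True)

src_feats = rng.normal(size=(20, feat_dim))
src_labels = rng.integers(0, num_classes, size=20)
aug_feats, aug_labels = d_cfa_augment(src_feats, src_labels)
probs = generative_classifier(tgt_feats)
```

In the full method, a consistency regularizer (e.g. a divergence between `probs` and the discriminative classifier's softmax outputs on the same features) would couple the two classifiers; that loss and the actual generative models are beyond this sketch.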