Domain generalization (DG) aims to learn a model from one or more different but related source domains that generalizes to an unseen target domain. Existing DG methods try to promote the diversity of source domains to improve the model's generalization ability, but they may have to introduce auxiliary networks or incur striking computational costs. In contrast, this work applies implicit semantic augmentation in feature space to capture the diversity of source domains. Concretely, an additional distance metric learning (DML) loss function is included to optimize the local geometry of the data distribution. Besides, the logits from the cross-entropy loss with infinite augmentations are adopted as the input features for the DML loss in lieu of the deep features. We also provide a theoretical analysis showing that the logits can well approximate the distances defined on the original features. Further, we provide an in-depth analysis of the mechanism and rationale behind our approach, which gives us a better understanding of why leveraging logits in lieu of features can help domain generalization. The proposed DML loss with implicit augmentation is incorporated into a recent DG method, namely the Fourier Augmented Co-Teacher (FACT) framework. Meanwhile, our method can also be easily plugged into various DG methods. Extensive experiments on three benchmarks (Digits-DG, PACS and Office-Home) demonstrate that the proposed method achieves state-of-the-art performance.
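To make the recipe concrete, below is a minimal PyTorch sketch of the two ingredients the abstract describes: an ISDA-style upper bound of the cross-entropy loss under infinitely many Gaussian semantic augmentations, and a DML term computed on the resulting logits rather than on the deep features. The ISDA bound and the simple contrastive form of the DML term are assumptions standing in for the paper's exact losses, and all identifiers (`ImplicitAugmentedLogitDML`, `lambda_`, `class_cov`, the margin value) are illustrative, not the authors' actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ImplicitAugmentedLogitDML(nn.Module):
    """Sketch: cross-entropy with implicit (infinite) semantic augmentation,
    plus a pairwise DML loss computed on the augmented logits.

    Assumes an ISDA-style closed-form upper bound; `class_cov` holds
    class-conditional feature covariances, which a real implementation
    would estimate online from the training features.
    """

    def __init__(self, num_classes: int, feat_dim: int, lambda_: float = 0.5):
        super().__init__()
        self.lambda_ = lambda_
        # Placeholder covariance estimates, one (D, D) matrix per class.
        self.register_buffer(
            "class_cov", torch.eye(feat_dim).repeat(num_classes, 1, 1)
        )

    def augmented_logits(self, features, weight, bias, labels):
        # Plain logits z = W f + b for a linear classifier head.
        logits = features @ weight.t() + bias
        # ISDA correction: for each class j, add
        # (lambda / 2) * (w_j - w_y)^T Sigma_y (w_j - w_y),
        # i.e. the effect of infinitely many Gaussian feature augmentations.
        w_y = weight[labels]                           # (B, D)
        diff = weight.unsqueeze(0) - w_y.unsqueeze(1)  # (B, C, D)
        cov_y = self.class_cov[labels]                 # (B, D, D)
        quad = torch.einsum("bcd,bde,bce->bc", diff, cov_y, diff)
        return logits + 0.5 * self.lambda_ * quad

    def forward(self, features, weight, bias, labels):
        z = self.augmented_logits(features, weight, bias, labels)
        ce = F.cross_entropy(z, labels)
        # DML term on logits instead of deep features: pull same-class
        # logits together, push different-class logits past a margin.
        dist = torch.cdist(z, z)                       # pairwise L2 on logits
        same = labels.unsqueeze(0) == labels.unsqueeze(1)
        margin = 1.0
        dml = torch.where(same, dist, F.relu(margin - dist)).mean()
        return ce + dml

# Hypothetical usage with a backbone and its linear head:
#   feats = backbone(x)                                # (B, feat_dim)
#   loss = criterion(feats, head.weight, head.bias, y)
```

Computing the DML term on the (augmented) logits rather than the deep features is the design choice the abstract motivates: per the paper's theoretical analysis, logit distances approximate feature distances well, while the logits live in a much lower-dimensional space (the number of classes) than the backbone features.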