Deep neural networks degrade markedly when applied to unseen environments due to covariate shift. Conventional approaches such as domain adaptation require pre-collected target data for iterative training, which is impractical in real-world applications. In this paper, we propose to adapt deep models to the novel environment during inference. A previous solution is test-time normalization, which substitutes the source statistics in BN layers with the target batch statistics. However, we show that test-time normalization may deteriorate the discriminative structures due to the mismatch between target batch statistics and source parameters. To this end, we present a general formulation, $\alpha$-BN, which calibrates the batch statistics by mixing the source and target statistics, both alleviating the domain shift and preserving the discriminative structures. Based on $\alpha$-BN, we further present a novel loss function to form a unified test-time adaptation framework, Core, which performs pairwise class correlation optimization online. Extensive experiments show that our approaches achieve state-of-the-art performance on a total of twelve datasets from three topics, including model robustness to corruptions, domain generalization on image classification, and semantic segmentation. In particular, our $\alpha$-BN improves performance from 28.4\% to 43.9\% on GTA5 $\rightarrow$ Cityscapes without any training, even outperforming the latest source-free domain adaptation method.
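To make the statistics-calibration idea concrete, below is a minimal PyTorch sketch of mixing source and target BN statistics at test time, in the spirit of the $\alpha$-BN formulation described above. The class name `AlphaBN`, the wrapping of an existing `nn.BatchNorm2d`, and the default value of `alpha` are illustrative assumptions, not the paper's reference implementation.

```python
import torch
import torch.nn as nn


class AlphaBN(nn.Module):
    """Test-time BN calibration sketch: mix source running statistics
    with the statistics of the current target batch.

    alpha = 1 recovers standard inference with source statistics;
    alpha = 0 recovers plain test-time normalization.
    """

    def __init__(self, bn: nn.BatchNorm2d, alpha: float = 0.9):
        super().__init__()
        self.bn = bn          # frozen, source-trained BN layer
        self.alpha = alpha    # mixing coefficient (hyperparameter)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Per-channel statistics of the current target batch over (N, H, W).
        mu_t = x.mean(dim=(0, 2, 3))
        var_t = x.var(dim=(0, 2, 3), unbiased=False)

        # Calibrated statistics: convex combination of source and target.
        mu = self.alpha * self.bn.running_mean + (1 - self.alpha) * mu_t
        var = self.alpha * self.bn.running_var + (1 - self.alpha) * var_t

        # Normalize with the mixed statistics, then apply the source affine parameters.
        x_hat = (x - mu[None, :, None, None]) / torch.sqrt(var[None, :, None, None] + self.bn.eps)
        return self.bn.weight[None, :, None, None] * x_hat + self.bn.bias[None, :, None, None]
```

In practice, such a wrapper could replace each BN layer of a source-trained network before running inference on target batches; no gradient updates are required for the calibration step itself.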