Standard Unsupervised Domain Adaptation (UDA) methods assume the availability of both source and target data during adaptation. In this work, we investigate Test-Time Adaptation (TTA), a specific case of UDA in which a model is adapted to a target domain without access to source data. We propose a novel approach for the TTA setting based on a loss reweighting strategy that brings robustness against the noise that inevitably affects the pseudo-labels. The classification loss is reweighted according to the reliability of the pseudo-labels, which is measured by estimating their uncertainty. Guided by this reweighting strategy, the pseudo-labels are progressively refined by aggregating knowledge from neighbouring samples. Furthermore, a self-supervised contrastive framework is leveraged as a target-space regulariser to enhance such knowledge aggregation. We also propose a novel negative-pair exclusion strategy that identifies and excludes negative pairs made of samples sharing the same class, even in the presence of noise in the pseudo-labels. Our method outperforms previous methods on three major benchmarks by a large margin. We set the new TTA state of the art on VisDA-C and DomainNet, with a performance gain of +1.8\% on both benchmarks, and on PACS, with +12.3\% in the single-source setting and +6.6\% in multi-target adaptation. Additional analyses demonstrate that the proposed approach is robust to noise, which results in significantly more accurate pseudo-labels compared to state-of-the-art approaches.
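To make the two core ideas concrete, the sketch below illustrates one plausible reading of the abstract: pseudo-labels are refined by averaging the predictions of nearest neighbours in a memory bank of target features, and the classification loss is reweighted by a reliability score derived from the uncertainty of the refined predictions. This is a minimal illustration, not the authors' implementation; the function names, the memory-bank design, and the use of normalised entropy as the uncertainty estimate are assumptions.

\begin{verbatim}
# Hedged sketch: neighbour-based pseudo-label refinement and
# uncertainty-based loss reweighting. Illustrative only; the paper's
# exact uncertainty measure and aggregation scheme may differ.
import torch
import torch.nn.functional as F

@torch.no_grad()
def refine_pseudo_labels(feats, feat_bank, prob_bank, k=5):
    """Refine pseudo-labels by aggregating the softmax predictions of
    the k nearest neighbours in a memory bank of target features."""
    feats = F.normalize(feats, dim=1)
    sims = feats @ F.normalize(feat_bank, dim=1).T   # (B, N) cosine sims
    _, idx = sims.topk(k, dim=1)                     # k nearest neighbours
    return prob_bank[idx].mean(dim=1)                # (B, C) refined probs

def reweighted_ce_loss(logits, refined_probs):
    """Cross-entropy on refined pseudo-labels, weighted by reliability.
    Reliability here is 1 minus the normalised entropy of the refined
    prediction: uncertain (high-entropy) pseudo-labels get low weight."""
    pseudo_labels = refined_probs.argmax(dim=1)
    num_classes = refined_probs.size(1)
    entropy = -(refined_probs
                * refined_probs.clamp_min(1e-8).log()).sum(dim=1)
    weights = 1.0 - entropy / torch.log(
        torch.tensor(float(num_classes)))            # in [0, 1]
    losses = F.cross_entropy(logits, pseudo_labels, reduction="none")
    return (weights * losses).mean()
\end{verbatim}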
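The negative-pair exclusion strategy can likewise be sketched as a masking step in a contrastive loss: pairs whose pseudo-labels agree are removed from the set of candidate negatives, so that samples likely to share a class are not pushed apart. Again a hedged illustration, assuming the refined pseudo-labels from the previous sketch are used to build the mask.

\begin{verbatim}
# Hedged sketch of negative-pair exclusion for a contrastive loss:
# same-pseudo-class pairs are masked out of the negatives.
import torch

def negative_pair_mask(pseudo_labels: torch.Tensor) -> torch.Tensor:
    """Return a boolean (B, B) mask that is True only for pairs allowed
    as negatives, i.e. samples whose pseudo-labels differ. Self-pairs
    and same-class pairs are excluded."""
    same_class = pseudo_labels.unsqueeze(0) == pseudo_labels.unsqueeze(1)
    return ~same_class
\end{verbatim}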