Test-time adaptation is a special setting of unsupervised domain adaptation in which a model trained on the source domain must adapt to the target domain without accessing source data. We propose a novel way to leverage self-supervised contrastive learning to facilitate target feature learning, along with an online pseudo-labeling scheme with refinement that significantly denoises pseudo labels. The contrastive learning task is applied jointly with pseudo labeling, contrasting positive and negative pairs constructed similarly to MoCo but with a source-initialized encoder and with same-class negative pairs (as indicated by pseudo labels) excluded. Meanwhile, we produce pseudo labels online and refine them via soft voting among their nearest neighbors in the target feature space, enabled by maintaining a memory queue. Our method, AdaContrast, achieves state-of-the-art performance on major benchmarks while exhibiting several desirable properties compared to existing work, including memory efficiency, insensitivity to hyper-parameters, and better model calibration. Project page: sites.google.com/view/adacontrast.
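The pseudo-label refinement described above can be illustrated with a minimal sketch: each target sample's label is taken as the argmax of the averaged class probabilities of its nearest neighbors in a feature memory queue. The function name, the use of cosine similarity, and the NumPy formulation are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def refine_pseudo_labels(feats, queue_feats, queue_probs, k=5):
    """Hypothetical sketch of soft-voting pseudo-label refinement.

    feats:       (N, D) target features for the current batch
    queue_feats: (Q, D) features stored in the memory queue
    queue_probs: (Q, C) class probabilities associated with queue entries
    Returns (N,) refined pseudo labels.
    """
    # L2-normalize so that the dot product is cosine similarity (assumption).
    feats = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    queue_feats = queue_feats / np.linalg.norm(queue_feats, axis=1, keepdims=True)

    sims = feats @ queue_feats.T                 # (N, Q) similarities
    nn_idx = np.argsort(-sims, axis=1)[:, :k]    # k nearest neighbors per sample
    avg_probs = queue_probs[nn_idx].mean(axis=1) # (N, C) soft vote over neighbors
    return avg_probs.argmax(axis=1)              # refined pseudo labels
```

A noisy label is thus overruled whenever the sample's neighborhood in the target feature space consistently votes for a different class.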