Deep neural networks (DNNs) often perform poorly in the presence of domain shift and category shift. How to upcycle DNNs and adapt them to the target task remains an important open problem. Unsupervised Domain Adaptation (UDA), especially the recently proposed Source-free Domain Adaptation (SFDA), has become a promising technology to address this issue. Nevertheless, existing SFDA methods require that the source domain and target domain share the same label space, and are consequently only applicable to the vanilla closed-set setting. In this paper, we take one step further and explore Source-free Universal Domain Adaptation (SF-UniDA). The goal is to identify "known" data samples under both domain and category shift, and to reject those "unknown" data samples (not present in the source classes), using only the knowledge from a standard pre-trained source model. To this end, we introduce an innovative global and local clustering learning technique (GLC). Specifically, we design a novel, adaptive one-vs-all global clustering algorithm to distinguish between different target classes, and introduce a local k-NN clustering strategy to alleviate negative transfer. We examine the superiority of our GLC on multiple benchmarks with different category shift scenarios, including partial-set, open-set, and open-partial-set DA. Remarkably, in the most challenging open-partial-set DA scenario, GLC outperforms UMAD by 14.8\% on the VisDA benchmark. The code is available at https://github.com/ispc-lab/GLC.
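As a rough illustration of the local clustering idea mentioned above, the sketch below refines noisy pseudo-labels by a majority vote among each sample's k nearest neighbors in feature space. This is a minimal, hypothetical example rather than the authors' actual GLC implementation (see the linked repository for that); the function name `knn_consensus_refine`, the cosine-similarity metric, and the vote threshold are assumptions made for illustration only.

```python
import numpy as np

def knn_consensus_refine(features, pseudo_labels, k=4):
    """Refine per-sample pseudo-labels via a majority vote among each
    sample's k nearest neighbors (cosine similarity).

    features:      (N, D) array of target-domain features
    pseudo_labels: (N,) integer labels from a global clustering step
                   (e.g., -1 could mark samples rejected as "unknown")
    Returns an (N,) array of refined labels.
    """
    # L2-normalize so that the dot product equals cosine similarity.
    feats = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-12)
    sim = feats @ feats.T                      # pairwise cosine similarity
    np.fill_diagonal(sim, -np.inf)             # exclude self-matches
    nn_idx = np.argsort(-sim, axis=1)[:, :k]   # indices of k nearest neighbors

    refined = pseudo_labels.copy()
    for i, neighbors in enumerate(nn_idx):
        votes = pseudo_labels[neighbors]
        values, counts = np.unique(votes, return_counts=True)
        # Overwrite only when the neighborhood clearly agrees; otherwise
        # keep the original (possibly "unknown") assignment.
        if counts.max() > k // 2:
            refined[i] = values[np.argmax(counts)]
    return refined

if __name__ == "__main__":
    # Toy usage: 200 random 64-d features with noisy labels over 5 classes.
    rng = np.random.default_rng(0)
    feats = rng.normal(size=(200, 64)).astype(np.float32)
    labels = rng.integers(0, 5, size=200)
    print(knn_consensus_refine(feats, labels, k=4)[:10])
```

The intent of such a local consensus step is to smooth out isolated, likely erroneous assignments from the global clustering, which is one way negative transfer can be mitigated in practice.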