Source-free domain adaptation, where only a pre-trained source model is available for adapting to the target distribution, is a more general approach to domain adaptation. However, accurately capturing the inherent structure of the target features is challenging due to the lack of supervised information in the target domain. To tackle this problem, we propose a novel approach called Adaptive Local Transfer (ALT), which pursues efficient feature clustering from the perspective of label propagation. ALT divides the target data into inner and outlier samples based on an adaptive threshold tied to the model's learning state, and applies a customized learning strategy that best fits each group's data properties. Specifically, inner samples are used to learn the intra-class structure, owing to their relatively well-clustered nature, while the low-density outlier samples are regularized by input consistency so that their predictions better agree with the ground-truth labels. In this way, local clustering is prevented from forming spurious clusters while label information is effectively propagated among subpopulations. Empirical evidence demonstrates that ALT outperforms state-of-the-art methods on three public benchmarks: Office-31, Office-Home, and VisDA.
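A minimal sketch of the inner/outlier split and per-group objectives described above is given below. It assumes a fixed confidence threshold, externally supplied pseudo labels from label propagation, and two augmented views of each input; the cross-entropy and KL-consistency terms are illustrative stand-ins, not the exact ALT objectives.

```python
import torch
import torch.nn.functional as F

def split_inner_outlier(logits, threshold):
    """Split target samples by prediction confidence.

    logits:    (N, C) outputs of the adapting model on target data.
    threshold: confidence cutoff; ALT adapts this to the learning state,
               but this sketch treats it as a given scalar.
    """
    confidence = F.softmax(logits, dim=1).max(dim=1).values
    inner_mask = confidence >= threshold
    return inner_mask, ~inner_mask

def alt_style_objective(logits_weak, logits_strong, pseudo_labels, threshold=0.9):
    """Inner samples: supervised-style loss on propagated pseudo labels.
    Outlier samples: consistency between two augmented views of the input.
    """
    inner, outlier = split_inner_outlier(logits_weak, threshold)
    loss = logits_weak.new_zeros(())
    if inner.any():
        # Exploit the relatively well-clustered structure of inner samples
        # by fitting pseudo labels obtained from label propagation.
        loss = loss + F.cross_entropy(logits_weak[inner], pseudo_labels[inner])
    if outlier.any():
        # Regularize low-density outlier samples via input consistency:
        # predictions should agree across weak/strong augmentations.
        p_weak = F.softmax(logits_weak[outlier], dim=1).detach()
        log_p_strong = F.log_softmax(logits_strong[outlier], dim=1)
        loss = loss + F.kl_div(log_p_strong, p_weak, reduction="batchmean")
    return loss
```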