Standard contrastive learning operates on extracted features with $\ell_{2}$ normalization. For domain adaptation tasks, however, we find that contrastive learning in this standard paradigm does not perform well. The main reason is that the class weights (the weights of the final fully connected layer) are not involved in the optimization, so there is no guarantee that the learned features cluster around the class weights learned from the source data. To tackle this issue, we propose a simple yet powerful probabilistic contrastive learning (PCL) method, which not only produces compact features but also enforces their distribution around the class weights. Specifically, we break the traditional contrastive learning paradigm (features + $\ell_{2}$ normalization) by replacing the features with probabilities and removing the $\ell_{2}$ normalization. In this way, we can push the probabilities toward a one-hot form, thereby narrowing the distance between the features and the class weights. Owing to its conciseness, PCL is generic and can be applied to different tasks. In this paper, we conduct extensive experiments on five tasks, \textit{i.e.}, unsupervised domain adaptation (UDA), semi-supervised domain adaptation (SSDA), semi-supervised learning (SSL), UDA detection, and UDA semantic segmentation. The results demonstrate that PCL brings significant gains on these tasks. In particular, for segmentation, our method equipped with PCL matches or even surpasses CPSL-D at a much smaller training cost (1$\times$3090 for 5 days vs. 4$\times$V100 for 11 days). Code is available at https://github.com/ljjcoder/Probabilistic-Contrastive-Learning.
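The core idea above (replace $\ell_{2}$-normalized features with softmax probabilities inside an InfoNCE-style contrastive loss) can be sketched as follows. This is a minimal NumPy illustration under our own assumptions, not the paper's implementation: the function names, the temperature value, and the toy class logits are all hypothetical.

```python
import numpy as np

def softmax(z, axis=-1):
    """Numerically stable softmax."""
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def contrastive_loss(anchors, candidates, pos_idx, temperature=0.1,
                     l2_normalize=True):
    """InfoNCE-style loss. With l2_normalize=True this is the standard
    feature-space contrastive loss; the PCL variant instead feeds class
    probabilities and drops the l2 normalization."""
    if l2_normalize:
        anchors = anchors / np.linalg.norm(anchors, axis=-1, keepdims=True)
        candidates = candidates / np.linalg.norm(candidates, axis=-1,
                                                 keepdims=True)
    logits = anchors @ candidates.T / temperature
    # Log-probability of each anchor matching each candidate.
    log_p = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_p[np.arange(len(anchors)), pos_idx].mean()

def pcl_loss(class_logits_a, class_logits_b, pos_idx, temperature=0.1):
    """PCL sketch: contrast softmax probabilities without l2 normalization.
    Maximizing the inner product of probability vectors pushes them toward
    one-hot form, which pulls features toward the class weights."""
    return contrastive_loss(softmax(class_logits_a), softmax(class_logits_b),
                            pos_idx, temperature, l2_normalize=False)

# Toy usage: two augmented views with near-one-hot classifier outputs.
logits_a = np.array([[5.0, 0.0, 0.0], [0.0, 5.0, 0.0]])
logits_b = np.array([[5.0, 0.0, 0.0], [0.0, 5.0, 0.0]])
loss_matched = pcl_loss(logits_a, logits_b, np.array([0, 1]))
loss_mismatched = pcl_loss(logits_a, logits_b, np.array([1, 0]))
```

Because the similarity is an inner product of probability vectors, a pair of views only scores highly when both probabilities concentrate on the same class, which is exactly the one-hot pressure the abstract describes.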