Unsupervised out-of-distribution (OOD) detection aims to separate samples that fall outside the distribution of the training data, without using label information. Among its many branches, contrastive learning has shown an excellent ability to learn discriminative representations for OOD detection. However, its view is limited: by focusing only on instance-level relationships between augmented samples, it overlooks the relationships between samples that share the same semantics. Building on classic contrastive learning, we propose the Cluster-aware Contrastive Learning (CCL) framework for unsupervised OOD detection, which considers both instance-level and semantic-level information. Specifically, we study a cooperation strategy between clustering and contrastive learning to effectively extract latent semantics, and we design a cluster-aware contrastive loss to enhance OOD discriminability. The loss attends to global and local relationships simultaneously by treating both the cluster centers and the samples belonging to the same cluster as positives. We conducted extensive experiments to verify the effectiveness of our framework, and the model achieves significant improvements on various image benchmarks.
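The abstract only describes the loss at a high level. As a rough illustration, below is a minimal PyTorch sketch of a contrastive objective that treats both other samples in the same cluster (local positives) and the sample's own cluster center (global positive) as positives. The function name, signature, and the exact composition of the positive and negative sets are our assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def cluster_aware_contrastive_loss(z, cluster_ids, centers, tau=0.5):
    """Illustrative cluster-aware contrastive loss (assumed form, not the paper's code).

    z           : (N, D) embeddings of the current batch
    cluster_ids : (N,)   hard cluster assignment per sample (e.g. from k-means)
    centers     : (K, D) cluster centers in the same embedding space
    tau         : softmax temperature
    """
    z = F.normalize(z, dim=1)
    centers = F.normalize(centers, dim=1)
    N = z.size(0)

    sim = z @ z.t() / tau          # sample-to-sample similarities, (N, N)
    sim_c = z @ centers.t() / tau  # sample-to-center similarities, (N, K)

    # Exclude each sample's similarity with itself from the softmax.
    self_mask = torch.eye(N, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, float('-inf'))

    # Softmax over all candidates: the other N-1 samples plus the K centers.
    log_prob = F.log_softmax(torch.cat([sim, sim_c], dim=1), dim=1)  # (N, N+K)

    # Local term: average log-probability of same-cluster samples as positives.
    pos_mask = (cluster_ids.unsqueeze(0) == cluster_ids.unsqueeze(1)) & ~self_mask
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)
    local = -(log_prob[:, :N] * pos_mask).sum(dim=1) / pos_counts

    # Global term: log-probability of the sample's own cluster center.
    global_ = -log_prob[torch.arange(N, device=z.device), N + cluster_ids]

    return (local + global_).mean()

# Toy usage with random tensors:
z = torch.randn(8, 128)           # batch of 8 embeddings
ids = torch.randint(0, 3, (8,))   # assignments to 3 clusters
centers = torch.randn(3, 128)     # 3 cluster centers
print(cluster_aware_contrastive_loss(z, ids, centers))
```

In a full pipeline, the instance-level augmentation pair of classic contrastive learning would also count as a positive; the sketch omits this to isolate the cluster-aware (semantic-level) terms the abstract describes.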