Whilst contrastive learning has recently brought notable benefits to deep clustering of unlabelled images by learning sample-specific discriminative visual features, its potential for explicitly inferring class decision boundaries is less well understood. This is because its instance discrimination strategy is not class sensitive; the clusters derived from the resulting sample-specific feature space are therefore not optimised to correspond to meaningful class decision boundaries. In this work, we address this problem by introducing Semantic Contrastive Learning (SCL). SCL explicitly imposes distance-based cluster structures on unlabelled training data by formulating a semantic (cluster-aware) contrastive learning objective. Moreover, we introduce a clustering consistency condition to be satisfied jointly by both instance visual similarities and cluster decision boundaries, and optimise both concurrently to reason on-the-fly about hypotheses of the (unknown/unlabelled) semantic ground-truth classes by their consensus. This semantic contrastive learning approach to discovering unknown class decision boundaries offers considerable advantages for unsupervised learning of object recognition tasks. Extensive experiments show that SCL outperforms state-of-the-art contrastive learning and deep clustering methods on six object recognition benchmarks, especially on the more challenging finer-grained and larger-scale datasets.
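The abstract does not specify the loss formulation. Purely as an illustration of what a semantic (cluster-aware) contrastive objective can look like, the sketch below combines a standard instance-level InfoNCE term with a cluster-level contrastive term over soft cluster assignments and an entropy regulariser against cluster collapse. All function names, tensor shapes, and the particular combination of terms are assumptions for illustration, not the paper's actual SCL objective.

```python
import torch
import torch.nn.functional as F


def semantic_contrastive_loss(z_a, z_b, cluster_logits_a, cluster_logits_b,
                              temperature=0.5):
    """Hypothetical cluster-aware contrastive loss (illustrative sketch only).

    z_a, z_b: L2-normalised embeddings of two augmented views, shape (N, D).
    cluster_logits_a, cluster_logits_b: cluster-assignment logits, shape (N, K).
    """
    n = z_a.size(0)

    # Instance-level contrastive term: standard InfoNCE over the 2N views,
    # where each view's positive is the other augmentation of the same image.
    z = torch.cat([z_a, z_b], dim=0)                 # (2N, D)
    sim = z @ z.t() / temperature                    # (2N, 2N)
    sim.fill_diagonal_(float('-inf'))                # exclude self-pairs
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)])
    instance_loss = F.cross_entropy(sim, targets)

    # Cluster-level contrastive term: treat each cluster's soft-assignment
    # column as a "cluster representation" and contrast clusters across views,
    # encouraging the same cluster decision boundary under both augmentations.
    p_a = F.softmax(cluster_logits_a, dim=1)         # (N, K)
    p_b = F.softmax(cluster_logits_b, dim=1)
    c_a = F.normalize(p_a.t(), dim=1)                # (K, N)
    c_b = F.normalize(p_b.t(), dim=1)
    k = c_a.size(0)
    c = torch.cat([c_a, c_b], dim=0)                 # (2K, N)
    csim = c @ c.t() / temperature
    csim.fill_diagonal_(float('-inf'))
    ctargets = torch.cat([torch.arange(k) + k, torch.arange(k)])
    cluster_loss = F.cross_entropy(csim, ctargets)

    # Entropy regulariser: penalise degenerate solutions that collapse all
    # samples into a single cluster by keeping the mean assignment spread out.
    mean_p = p_a.mean(dim=0)
    entropy = -(mean_p * torch.log(mean_p + 1e-8)).sum()

    return instance_loss + cluster_loss - entropy
```

The design intuition this sketch captures is the consensus described above: the instance term shapes sample-specific visual similarities, while the cluster term shapes decision boundaries, and both are optimised jointly so that the two views agree on a single cluster structure.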