Learning image representations that benefit downstream tasks is a challenging problem in computer vision, and a wide variety of self-supervised learning approaches have been proposed to address it. Among them, contrastive learning has shown competitive performance on several benchmark datasets. The embeddings produced by contrastive learning lie on a hypersphere, so similarity is measured with the inner (dot) product in Euclidean space. However, data in many scientific fields, such as social networks, brain imaging, and computer graphics, exhibit highly non-Euclidean latent geometry. We propose a novel contrastive learning framework that learns semantic relationships in hyperbolic space. Hyperbolic space can be viewed as a continuous analogue of trees: it naturally models hierarchical structures and is thus well suited to efficient contrastive representation learning. We also extend the proposed Hyperbolic Contrastive Learning (HCL) to the supervised setting and study the adversarial robustness of HCL. Comprehensive experiments show that our method outperforms baselines on self-supervised pretraining and supervised classification, and achieves higher robust accuracy.
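The abstract contrasts the inner-product similarity of standard contrastive learning with distances in hyperbolic space. As an illustrative sketch only (the function names are hypothetical and the abstract does not specify which model of hyperbolic space is used), the Poincaré ball metric, a common choice for hyperbolic embeddings, can be compared against cosine similarity. Distances grow rapidly toward the boundary of the ball, which is the property that lets hyperbolic space embed tree-like hierarchies with low distortion:

```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Geodesic distance in the Poincare ball model of hyperbolic space.
    Assumes ||u|| < 1 and ||v|| < 1 (points inside the unit ball)."""
    sq_dist = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return float(np.arccosh(1.0 + 2.0 * sq_dist / max(denom, eps)))

def cosine_similarity(u, v):
    """Inner-product similarity used by standard Euclidean contrastive losses."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# A point near the origin (a "root") and two points near the boundary
# ("leaves"): the leaves are far from each other hyperbolically even
# though their Euclidean separation is modest.
root = np.array([0.0, 0.0])
leaf_a = np.array([0.90, 0.0])
leaf_b = np.array([0.0, 0.90])

print(poincare_distance(root, leaf_a))   # root-to-leaf distance
print(poincare_distance(leaf_a, leaf_b)) # leaf-to-leaf: much larger
```

Here the leaf-to-leaf distance exceeds the root-to-leaf distance, mirroring path lengths in a tree; a flat inner product on the same coordinates would not capture this hierarchy.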