This paper introduces Ranking Info Noise Contrastive Estimation (RINCE), a new member of the family of InfoNCE losses that preserves a ranked ordering of positive samples. In contrast to the standard InfoNCE loss, which requires a strict binary separation of the training pairs into similar and dissimilar samples, RINCE can exploit information about a similarity ranking to learn a corresponding embedding space. We show that the proposed loss function learns favorable embeddings compared to the standard InfoNCE whenever ranking information, even if noisy, is available, or when the definition of positives and negatives is blurry. We demonstrate this on a supervised classification task with additional superclass labels and noisy similarity scores. Furthermore, we show that RINCE can also be applied to unsupervised training, with experiments on unsupervised representation learning from videos. In particular, the embedding yields higher classification accuracy and retrieval rates, and performs better in out-of-distribution detection than the standard InfoNCE loss.
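To make the contrast with standard InfoNCE concrete, the following is a minimal sketch of a ranked contrastive loss in the spirit described above. It is an illustrative formulation, not necessarily the paper's exact loss: at rank level `i`, positives of rank `<= i` form the numerator of an InfoNCE term, while lower-ranked positives are treated as negatives at that level. The function names, the single temperature `tau`, and the averaging over rank levels are all assumptions for illustration.

```python
import numpy as np

def info_nce(pos_sims, neg_sims, tau=0.1):
    # Standard InfoNCE term: similarity scores of positives vs. negatives.
    pos = np.exp(np.asarray(pos_sims, dtype=float) / tau)
    neg = np.exp(np.asarray(neg_sims, dtype=float) / tau)
    return -np.log(pos.sum() / (pos.sum() + neg.sum()))

def ranked_info_nce(ranked_pos_sims, neg_sims, tau=0.1):
    """Illustrative ranked contrastive loss (hypothetical formulation).

    ranked_pos_sims: list of similarity scores, ordered from the
    highest-ranked (most similar) positive to the lowest-ranked.
    At level i, positives of rank <= i are the numerator; positives of
    rank > i join the true negatives in the denominator.
    """
    total = 0.0
    r = len(ranked_pos_sims)
    for i in range(r):
        pos = np.atleast_1d(np.asarray(ranked_pos_sims[: i + 1], dtype=float))
        lower = np.asarray(ranked_pos_sims[i + 1:], dtype=float)
        neg = np.concatenate([np.atleast_1d(np.asarray(neg_sims, dtype=float)), lower])
        total += info_nce(pos, neg, tau)
    return total / r
```

Under this sketch, an embedding that respects the ranking (higher similarity for higher-ranked positives) incurs a lower loss than one that inverts it, which is what distinguishes the ranked loss from a binary positive/negative split.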