We present a new framework for self-supervised representation learning that formulates it as a ranking problem in an image-retrieval setting over a large number of random views (augmentations) of images. Our work rests on two intuitions: first, a good image representation should yield high-quality image rankings in a retrieval task; second, random views of an image should rank closer to a reference view of that image than random views of other images. We therefore model representation learning as a learning-to-rank problem for image retrieval and train a representation encoder by maximizing average precision (AP) for ranking, where random views of an image are treated as positives and views of other images as negatives. The new framework, dubbed S2R2, computes a global objective over multiple views, in contrast to the local objective of the popular contrastive learning framework, which is computed on pairs of views. In principle, the ranking criterion removes the reliance on object-centric curated datasets. When trained on STL10 and MS-COCO, S2R2 outperforms SimCLR and the clustering-based contrastive learning model SwAV, while being conceptually simpler and easier to implement. On MS-COCO, S2R2 outperforms both SwAV and SimCLR by a larger margin than on STL10. This indicates that S2R2 is more effective on diverse scenes and could remove the need for a large object-centric training dataset in self-supervised representation learning.
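To make the ranking objective concrete, below is a minimal PyTorch sketch of a differentiable AP surrogate over a batch of views, where views sharing a source image are positives and all other views are negatives. The sigmoid relaxation of the rank (in the style of SmoothAP) and all names here (smooth_ap_loss, tau) are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def smooth_ap_loss(embeddings: torch.Tensor, labels: torch.Tensor,
                   tau: float = 0.01) -> torch.Tensor:
    """Differentiable surrogate of mean Average Precision over a batch of views.

    embeddings: (N, D) encoder outputs for N views.
    labels:     (N,) source-image indices; views of the same image are positives.
    tau:        temperature of the sigmoid relaxing the Heaviside step in the
                rank computation (assumed hyperparameter).
    """
    x = F.normalize(embeddings, dim=1)
    sim = x @ x.t()                                        # (N, N) cosine scores
    n = sim.size(0)
    eye = torch.eye(n, dtype=torch.bool, device=x.device)

    # positives per query, excluding the query view itself
    pos = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~eye

    # diff[q, i, j] = sim[q, j] - sim[q, i]; sigmoid(diff / tau) ~ 1 if j outranks i
    diff = sim.unsqueeze(1) - sim.unsqueeze(2)             # (N, N, N), O(N^3) memory
    sg = torch.sigmoid(diff / tau)

    # valid[q, i, j]: competitor j is neither the query q nor the ranked item i
    valid = (~eye).unsqueeze(0) & (~eye).unsqueeze(1)

    rank_all = 1 + (sg * valid).sum(dim=2)                 # rank of i in full gallery
    rank_pos = 1 + (sg * (valid & pos.unsqueeze(1))).sum(dim=2)  # rank among positives

    n_pos = pos.sum(dim=1).clamp(min=1)
    ap = ((rank_pos / rank_all) * pos).sum(dim=1) / n_pos  # AP per query view
    return 1.0 - ap.mean()                                 # minimize 1 - mean AP

# Toy usage: 8 images with 4 random views each; z stands in for encoder outputs.
z = torch.randn(32, 128, requires_grad=True)
labels = torch.arange(8).repeat_interleave(4)
loss = smooth_ap_loss(z, labels)
loss.backward()
```

Because every view acts as a query against the whole batch, the objective is global over all views rather than pairwise; the cubic memory of the relaxed rank tensor is tolerable at typical batch sizes but would need chunking at scale.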