Self-supervised learning has recently achieved great success in representation learning without human annotations. The dominant method, contrastive learning, is generally based on instance discrimination tasks, i.e., individual samples are treated as independent categories. However, presuming that all samples are different contradicts the natural grouping of similar samples in common visual datasets, e.g., multiple views of the same dog. To bridge this gap, this paper proposes Adaptive Soft Contrastive Learning (ASCL), an adaptive method that introduces soft inter-sample relations. More specifically, ASCL transforms the original instance discrimination task into a multi-instance soft discrimination task and adaptively introduces inter-sample relations. As an effective and concise plug-in module for existing self-supervised learning frameworks, ASCL achieves the best results on several benchmarks in terms of both accuracy and efficiency. Code is available at https://github.com/MrChenFeng/ASCL_ICPR2022.
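The core idea above — replacing one-hot instance-discrimination targets with soft inter-sample relations — can be sketched as follows. This is a minimal illustrative NumPy sketch, not the authors' implementation: the function name `soft_contrastive_loss`, the fixed `mix` weight (ASCL sets this adaptively), and the choice of building soft targets from key-view similarities are all assumptions for exposition.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def soft_contrastive_loss(z_q, z_k, temperature=0.1, mix=0.5):
    """Illustrative soft contrastive objective (hypothetical sketch).

    z_q, z_k: (N, D) embeddings of two augmented views of the same N images.
    mix: weight blending one-hot targets with soft inter-sample relations.
         (In ASCL this weight is set adaptively; fixed here for simplicity.)
    """
    # L2-normalize embeddings so dot products are cosine similarities.
    z_q = z_q / np.linalg.norm(z_q, axis=1, keepdims=True)
    z_k = z_k / np.linalg.norm(z_k, axis=1, keepdims=True)

    logits = z_q @ z_k.T / temperature          # (N, N) cross-view similarities
    n = z_q.shape[0]

    hard = np.eye(n)                            # standard instance-discrimination targets
    soft = softmax(z_k @ z_k.T / temperature)   # soft relations among key-view samples
    targets = (1.0 - mix) * hard + mix * soft   # blended multi-instance soft targets

    # Cross-entropy between soft targets and the predicted distribution.
    log_p = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -(targets * log_p).sum(axis=1).mean()
```

With `mix=0` this reduces to the usual one-hot contrastive (InfoNCE-style) loss; increasing `mix` lets visually similar samples share probability mass instead of being forced apart as independent categories.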