Contrastive learning is a method of learning visual representations by training Deep Neural Networks (DNNs) to increase the similarity between representations of positive pairs (transformations of the same image) and to reduce the similarity between representations of negative pairs (transformations of different images). Here we explore Energy-Based Contrastive Learning (EBCLR), which leverages the power of generative learning by combining contrastive learning with Energy-Based Models (EBMs). EBCLR can be theoretically interpreted as learning the joint distribution of positive pairs, and it shows promising results on small- and medium-scale datasets such as MNIST, Fashion-MNIST, CIFAR-10, and CIFAR-100. Specifically, we find that EBCLR demonstrates a 4× up to 20× acceleration compared to SimCLR and MoCo v2 in terms of training epochs. Furthermore, in contrast to SimCLR, we observe that EBCLR achieves nearly the same performance with 254 negative pairs (batch size 128) and 30 negative pairs (batch size 16) per positive pair, demonstrating the robustness of EBCLR to small numbers of negative pairs. Hence, EBCLR provides a novel avenue for improving contrastive learning methods, which usually require large datasets with a significant number of negative pairs per iteration to achieve reasonable performance on downstream tasks. Code: https://github.com/1202kbs/EBCLR
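For context, the sketch below illustrates the standard contrastive (InfoNCE / SimCLR-style) objective referred to in the abstract, not EBCLR's energy-based objective; it is a minimal sketch assuming a PyTorch setup, and the function name and hyperparameters are illustrative. Note that with batch size N, each positive pair is contrasted against 2N − 2 negatives, which matches the counts of 254 (batch size 128) and 30 (batch size 16) negative pairs per positive pair mentioned above.

```python
import torch
import torch.nn.functional as F

def info_nce_loss(z1, z2, temperature=0.5):
    """Generic InfoNCE-style contrastive loss (illustrative, not the EBCLR objective).

    z1, z2: (N, D) embeddings of two augmented views of the same N images.
    For each anchor, the other view of the same image is its positive pair;
    the remaining 2N - 2 embeddings in the batch serve as negative pairs.
    """
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2N, D), unit-norm rows
    sim = z @ z.t() / temperature                        # scaled cosine similarities
    # Mask self-similarity so each row's softmax runs over the other 2N - 1 samples.
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float("-inf"))
    # Positive index: view i in z1 pairs with view i in z2, and vice versa.
    pos_idx = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, pos_idx)
```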