Recently, Graph Neural Networks (GNNs) have achieved remarkable success in recommendation. To mitigate the influence of data sparsity, Graph Contrastive Learning (GCL) has been adopted in GNN-based collaborative filtering (CF) methods to enhance performance. Most GCL methods consist of data augmentation and a contrastive loss (e.g., InfoNCE). GCL methods construct contrastive pairs via hand-crafted graph augmentations and maximize the agreement between different views of the same node relative to that of other nodes, which is known as the InfoMax principle. However, improper data augmentation can hinder the performance of GCL. The InfoMin principle states that a good set of views shares minimal information, and it provides guidelines for designing better data augmentation. In this paper, we first propose a new data augmentation (i.e., edge operating, including edge adding and edge dropping). Then, guided by the InfoMin principle, we propose a novel theoretically guided contrastive learning framework, named Learnable Data Augmentation for Graph Contrastive Learning (LDA-GCL). Our method comprises data augmentation learning and graph contrastive learning, which follow the InfoMin and InfoMax principles, respectively. In implementation, our method optimizes an adversarial loss function to learn both the data augmentation and effective representations of users and items. Extensive experiments on four public benchmark datasets demonstrate the effectiveness of LDA-GCL.
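For concreteness, the InfoNCE objective mentioned above is typically instantiated as follows; the notation here ($z'_u$ and $z''_u$ for the two augmented views of node $u$, $\mathrm{sim}$ for a similarity function such as cosine similarity, and $\tau$ for a temperature hyperparameter) is the standard formulation from the contrastive learning literature, given as a generic sketch rather than LDA-GCL's exact loss:

$$
\mathcal{L}_{\mathrm{InfoNCE}} \;=\; -\sum_{u} \log \frac{\exp\!\big(\mathrm{sim}(z'_u, z''_u)/\tau\big)}{\sum_{v} \exp\!\big(\mathrm{sim}(z'_u, z''_v)/\tau\big)}.
$$

Minimizing this loss pulls together the two views of the same node (the numerator) while pushing apart views of different nodes (the denominator), which realizes the InfoMax behavior described above.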