Contrastive learning (CL) has proven highly effective in graph-based semi-supervised learning (SSL), since it can efficiently supplement the limited task information from the annotated nodes in the graph. However, existing graph CL (GCL) studies ignore the uneven distribution of task information across the graph caused by the graph topology and the selection of annotated nodes. They apply CL to the whole graph evenly, which results in an incongruous combination of CL and graph learning. To address this issue, we propose to apply CL adaptively during graph learning by taking the task information received by each node into consideration. First, we introduce Group PageRank to measure the information gain each node receives from the graph, and find that CL mainly benefits nodes that are topologically far away from the labeled nodes. We then propose our Distance-wise Graph Contrastive Learning (DwGCL) method from two views: (1) from the global view of the task information distribution across the graph, we enhance the CL effect on nodes that are topologically far away from labeled nodes; (2) from the personal view of each node's received information, we measure the relative distance between nodes and adapt the sampling strategy of GCL accordingly. Extensive experiments on five benchmark graph datasets show that DwGCL brings a clear improvement over previous GCL methods. Our analysis of eight graph neural networks with various architectures, under three different annotation settings, further demonstrates the generalizability of DwGCL.
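The Group PageRank measure mentioned above can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes Group PageRank behaves like personalized PageRank with the teleport distribution restricted to the labeled nodes of each class, so that nodes topologically far from any labeled node accumulate low scores. The function name `group_pagerank` and the power-iteration details are illustrative assumptions.

```python
import numpy as np

def group_pagerank(adj, labeled_by_class, alpha=0.85, iters=100):
    """Sketch (assumed form): one personalized-PageRank vector per class,
    teleporting only to that class's labeled nodes; the per-class scores
    are summed into a per-node task-information estimate."""
    n = adj.shape[0]
    deg = adj.sum(axis=1, keepdims=True)
    P = adj / np.clip(deg, 1.0, None)        # row-stochastic transition matrix
    scores = np.zeros((len(labeled_by_class), n))
    for c, seeds in enumerate(labeled_by_class):
        t = np.zeros(n)
        t[seeds] = 1.0 / len(seeds)          # teleport mass on labeled nodes of class c
        r = t.copy()
        for _ in range(iters):               # power iteration until (near) convergence
            r = alpha * (r @ P) + (1 - alpha) * t
        scores[c] = r
    # Higher total score = topologically closer to labeled nodes,
    # i.e. more task information received from the graph.
    return scores.sum(axis=0)
```

On a path graph 0-1-2-3 with node 0 labeled, the score decays with distance from the label, matching the abstract's notion of nodes that are "topologically far away" receiving the least task information.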