Node classification is a task in which an algorithm must determine the label of a sample (represented as a node) by looking at the labels of its neighbors.
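As a toy illustration of this definition (a minimal sketch in plain Python, not taken from either paper below; the graph, labels, and `classify_by_neighbors` helper are all hypothetical), an unlabeled node can be classified by a majority vote over the labels of its already-labeled neighbors:

```python
# Minimal sketch: classify an unlabeled node by a majority vote over
# the labels of its already-labeled neighbors. Toy example only.
from collections import Counter

def classify_by_neighbors(node, adjacency, labels):
    """Return the most common label among `node`'s labeled neighbors,
    or None if no neighbor is labeled yet."""
    neighbor_labels = [labels[n] for n in adjacency[node] if n in labels]
    if not neighbor_labels:
        return None
    return Counter(neighbor_labels).most_common(1)[0][0]

# Toy graph: node 3 is unlabeled; its neighbors 0, 1, 2 carry labels.
adjacency = {0: [3], 1: [3], 2: [3], 3: [0, 1, 2]}
labels = {0: "A", 1: "A", 2: "B"}
print(classify_by_neighbors(3, adjacency, labels))  # -> "A"
```

Real systems (the GNNs and graph transformers discussed below) learn this from node features and multi-hop structure rather than a one-hop vote, but the task definition is the same.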

VIP Content

The class-imbalance problem, as an important issue in learning node representations, has drawn increasing attention from the community. While the imbalance considered in existing studies stems from the unequal quantity of labeled examples across classes (quantity imbalance), we argue that graph data expose a unique source of imbalance: the asymmetric topological properties of the labeled nodes, i.e., labeled nodes are not equal in terms of their structural roles in the graph (topology imbalance). In this work, we first probe the previously unknown topology-imbalance issue, including its characteristics, causes, and threats to semi-supervised node classification. We then provide a unified view that jointly analyzes the quantity- and topology-imbalance issues by considering the node influence shift phenomenon under the Label Propagation algorithm. In light of this analysis, we devise an influence-conflict-detection-based metric, Totoro, to measure the degree of graph topology imbalance, and propose a model-agnostic method, ReNode, which addresses the topology-imbalance issue by adaptively re-weighting the influence of labeled nodes according to their relative positions to the class boundaries. Systematic experiments demonstrate the effectiveness and generalizability of the method in relieving topology imbalance and promoting semi-supervised node classification. Further analysis reveals that different graph neural networks have varied sensitivity to topology imbalance, offering a new perspective for evaluating GNN architectures.

https://www.zhuanzhi.ai/paper/e4392c7e18418db5eab9b0f759470985
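The abstract analyzes node influence through the Label Propagation algorithm. Below is a minimal NumPy sketch of classic label propagation with symmetric normalization; the function name, iteration count, and damping factor `alpha` are illustrative assumptions, not the paper's actual implementation:

```python
# Hedged sketch of classic Label Propagation (the algorithm the abstract
# uses to analyze node influence). All parameter choices are illustrative.
import numpy as np

def label_propagation(A, Y, num_iters=50, alpha=0.9):
    """Propagate seed labels Y (n x c one-hot rows, zero rows for unlabeled
    nodes) over adjacency A (n x n) with symmetric normalization."""
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    S = D_inv_sqrt @ A @ D_inv_sqrt            # normalized adjacency
    F = Y.astype(float).copy()
    for _ in range(num_iters):
        F = alpha * (S @ F) + (1 - alpha) * Y  # propagate, keep seeds anchored
    return F                                    # per-node class influence scores

# Toy path graph 0-1-2-3 with one seed per class at the two ends.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
Y = np.zeros((4, 2))
Y[0, 0] = 1.0
Y[3, 1] = 1.0
print(label_propagation(A, Y).argmax(axis=1))  # -> [0 0 1 1]
```

The per-class mass a node accumulates in `F` can be read as the "influence" it receives from each labeled class; ReNode-style re-weighting, per the abstract, would then adaptively down-weight labeled nodes whose propagated influence conflicts with other classes near a class boundary.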

Latest Content

Transformers have achieved remarkable performance in a myriad of fields including natural language processing and computer vision. However, in the graph mining area, where the graph neural network (GNN) has been the dominant paradigm, transformers have not achieved competitive performance, especially on the node classification task. Existing graph transformer models typically apply a fully-connected attention mechanism over the whole input graph and thus suffer from severe scalability issues and are intractable to train in data-insufficient cases. To alleviate these issues, we propose a novel Gophormer model which applies transformers on ego-graphs instead of full graphs. Specifically, a Node2Seq module is proposed to sample ego-graphs as the input of transformers, which alleviates the challenge of scalability and serves as an effective data augmentation technique to boost model performance. Moreover, different from the feature-based attention strategy in vanilla transformers, we propose a proximity-enhanced attention mechanism to capture fine-grained structural bias. To handle the uncertainty introduced by ego-graph sampling, we further propose a consistency regularization and a multi-sample inference strategy for stabilized training and testing, respectively. Extensive experiments on six benchmark datasets demonstrate the superiority of Gophormer over existing graph transformers and popular GNNs, revealing the promising future of graph transformers.
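The Node2Seq module is only named in the abstract; the sketch below is one plausible reading of ego-graph sampling (breadth-first, with a capped fanout per hop), not Gophormer's actual interface. The function name and the `fanout`/`depth` parameters are assumptions for illustration:

```python
# Illustrative sketch only: sample a node's ego-graph as a token sequence
# for a transformer, in the spirit of the Node2Seq idea described above.
import random

def sample_ego_sequence(center, adjacency, fanout=4, depth=2, seed=None):
    """Breadth-first sample of up to `fanout` unvisited neighbors per node,
    `depth` hops out from `center`; returns visited nodes in visit order."""
    rng = random.Random(seed)
    sequence, frontier, visited = [center], [center], {center}
    for _ in range(depth):
        next_frontier = []
        for node in frontier:
            candidates = [n for n in adjacency[node] if n not in visited]
            for n in rng.sample(candidates, min(fanout, len(candidates))):
                visited.add(n)
                sequence.append(n)
                next_frontier.append(n)
        frontier = next_frontier
    return sequence  # token sequence fed to the transformer encoder

# Toy 4-node cycle; each call with a different seed draws a different view.
adjacency = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
print(sample_ego_sequence(0, adjacency, fanout=2, depth=2, seed=7))
```

Because each call draws a different random ego-graph, sampling a node several times naturally yields the multiple views that the consistency regularization and multi-sample inference described in the abstract would operate on.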
