Graph neural networks (GNNs) are ubiquitous in graph node classification tasks. Most GNN methods update node embeddings iteratively by aggregating information from their neighbors. However, they often suffer from negative disturbance caused by edges that connect nodes with different labels. One way to alleviate this negative disturbance is to use attention to learn aggregation weights, but current attention-based GNNs consider only feature similarity and also lack supervision. In this paper, we consider the label dependency of graph nodes and propose a decoupling attention mechanism that learns both hard and soft attention. The hard attention is learned on labels to produce a refined graph structure with fewer inter-class edges, so that the negative disturbance of aggregation can be reduced. The soft attention learns aggregation weights from features over the refined graph structure to enhance the information gain during message passing. In particular, we formulate our model under the EM framework, where the learned attention guides the label propagation in the M-step and the feature propagation in the E-step, respectively. Extensive experiments on six well-known benchmark graph datasets verify the effectiveness of the proposed method.
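To make the decoupled hard/soft attention concrete, below is a minimal PyTorch sketch of one aggregation step under simplifying assumptions: hard attention is approximated by pruning edges whose endpoints' predicted labels disagree, and soft attention by a temperature-scaled dot-product similarity over the refined graph. The function name `decoupled_attention_step`, the `temperature` parameter, and the argmax-agreement pruning rule are illustrative assumptions, not the paper's exact formulation.

```python
import torch

def decoupled_attention_step(x, label_dist, edge_index, temperature=1.0):
    """One illustrative message-passing step with decoupled attention.

    x:          [N, D] node features
    label_dist: [N, C] predicted label distributions for the nodes
    edge_index: [2, E] source/target indices of directed edges
    """
    src, dst = edge_index[0], edge_index[1]

    # Hard attention on labels (assumed rule): keep an edge only when the
    # endpoints' predicted labels agree, pruning likely inter-class edges.
    keep = label_dist[src].argmax(-1) == label_dist[dst].argmax(-1)
    src, dst = src[keep], dst[keep]

    # Soft attention on features: similarity-based weights over the refined
    # graph, normalized per destination node (softmax over incoming edges).
    logits = (x[src] * x[dst]).sum(-1) / temperature            # [E']
    w = torch.exp(logits - logits.max()) if logits.numel() else logits
    denom = torch.zeros(x.size(0)).scatter_add_(0, dst, w) + 1e-9
    alpha = w / denom[dst]                                      # [E']

    # Aggregate neighbor features with the soft attention weights.
    out = torch.zeros_like(x)
    out.index_add_(0, dst, alpha.unsqueeze(-1) * x[src])
    return out

# Toy usage with random data.
N, D, C = 6, 8, 3
x = torch.randn(N, D)
label_dist = torch.softmax(torch.randn(N, C), dim=-1)
edge_index = torch.randint(0, N, (2, 12))
h = decoupled_attention_step(x, label_dist, edge_index)
print(h.shape)  # torch.Size([6, 8])
```

Under the EM formulation described above, a step like this would alternate with label updates: the learned attention guides label propagation in the M-step and feature propagation in the E-step, with the hard attention re-estimated from the refreshed label distributions.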