Graph Neural Network (GNN) is a powerful model for learning representations and making predictions on graph data. Existing work on GNN has largely defined the graph convolution as a weighted sum of the features of the connected nodes to form the representation of the target node. However, the weighted-sum operation assumes the neighbor nodes are independent of each other and ignores possible interactions between them. When such interactions exist, e.g., when the co-occurrence of two neighbor nodes is a strong signal of the target node's characteristics, existing GNN models may fail to capture that signal. In this work, we argue for the importance of modeling the interactions between neighbor nodes in GNN. We propose a new graph convolution operator that augments the weighted sum with pairwise interactions between the representations of neighbor nodes. We term this framework Bilinear Graph Neural Network (BGNN), which improves the representation ability of GNN with bilinear interactions between neighbor nodes. In particular, we specify two BGNN models, named BGCN and BGAT, based on the well-known GCN and GAT, respectively. Empirical results on three public benchmarks for semi-supervised node classification verify the effectiveness of BGNN -- BGCN (BGAT) outperforms GCN (GAT) by 1.6% (1.5%) in classification accuracy. Code is available at: https://github.com/zhuhm1996/bgnn.
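To make the idea concrete, the following is a minimal NumPy sketch of the kind of layer the abstract describes: a standard weighted-sum aggregation augmented with pairwise element-wise interactions between neighbor representations. The function names (`bgnn_layer`, `bilinear_aggregate`), the mixing weight `alpha`, and the averaging over neighbor pairs are illustrative assumptions, not the paper's exact formulation; see the paper and the linked repository for the actual BGCN/BGAT definitions.

```python
import numpy as np


def weighted_sum_aggregate(H, weights):
    """Standard GNN aggregation: weighted sum of neighbor features.

    H: (n_neighbors, d) matrix of neighbor representations
    weights: (n_neighbors,) aggregation weights (e.g., normalized adjacency)
    """
    return weights @ H  # shape (d,)


def bilinear_aggregate(H):
    """Assumed bilinear aggregator: average of element-wise products over all
    neighbor pairs, using the identity
        sum_{i<j} h_i * h_j = 0.5 * ((sum_i h_i)^2 - sum_i h_i^2)
    to avoid enumerating pairs explicitly.
    """
    n = H.shape[0]
    s = H.sum(axis=0)
    pair_sum = 0.5 * (s * s - (H * H).sum(axis=0))  # sum over i<j of h_i ⊙ h_j
    n_pairs = n * (n - 1) / 2
    return pair_sum / max(n_pairs, 1.0)


def bgnn_layer(H, weights, W, alpha=0.5):
    """Hypothetical BGNN-style convolution for one target node: a convex
    combination of weighted-sum and bilinear aggregation of the transformed
    neighbor features (alpha is an assumed mixing hyperparameter)."""
    Z = H @ W  # linear transform of neighbor representations
    return (1 - alpha) * weighted_sum_aggregate(Z, weights) + alpha * bilinear_aggregate(Z)


# Toy usage: 4 neighbors with 8-dim features, projected to 16 dims.
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 8))       # neighbor features
w = np.full(4, 0.25)              # uniform aggregation weights
W = rng.normal(size=(8, 16))      # projection matrix (learnable in practice)
print(bgnn_layer(H, w, W).shape)  # (16,)
```

The pairwise term is what the weighted sum alone cannot express: it fires only when two neighbors are simultaneously active in the same feature dimensions, which is how a co-occurrence signal between neighbors can propagate to the target node's representation.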