We aim to better understand attention over nodes in graph neural networks (GNNs) and identify factors influencing its effectiveness. We particularly focus on the ability of attention GNNs to generalize to larger, more complex or noisy graphs. Motivated by insights from the work on Graph Isomorphism Networks, we design simple graph reasoning tasks that allow us to study attention in a controlled environment. We find that under typical conditions the effect of attention is negligible or even harmful, but under certain conditions it provides an exceptional gain in performance of more than 60% in some of our classification tasks. Satisfying these conditions in practice is challenging and often requires optimal initialization or supervised training of attention. We propose an alternative recipe and train attention in a weakly-supervised fashion that approaches the performance of supervised models, and, compared to unsupervised models, improves results on several synthetic as well as real datasets. Source code and datasets are available at https://github.com/bknyaz/graph_attention_pool.