Deploying graph neural networks (GNNs) on whole-graph classification or regression tasks is known to be challenging: it often requires computing node features that are mindful of both local interactions in their neighbourhood and the global context of the graph structure. GNN architectures that navigate this space need to avoid pathological behaviours, such as bottlenecks and oversquashing, while ideally having linear time and space complexity requirements. In this work, we propose an elegant approach based on propagating information over expander graphs. We leverage an efficient method for constructing expander graphs of a given size, and use this insight to propose the EGP model. We show that EGP is able to address all of the above concerns, while requiring minimal effort to set up, and provide evidence of its empirical utility on relevant graph classification datasets and baselines in the Open Graph Benchmark. Importantly, using expander graphs as a template for message passing necessarily gives rise to negative curvature. While this appears to be counterintuitive in light of recent related work on oversquashing, we theoretically demonstrate that negatively curved edges are likely to be required to obtain scalable message passing without bottlenecks. To the best of our knowledge, this is a previously unstudied result in the context of graph representation learning, and we believe our analysis paves the way to a novel class of scalable methods to counter oversquashing in GNNs.
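The idea of using an expander as a message-passing template can be sketched with one classical explicit expander family, the Margulis–Gabber–Galil construction on Z_n × Z_n. This is only an illustrative stand-in: the function names (`mgg_expander`, `propagate`) and the choice of mean aggregation are assumptions for this sketch, not necessarily the construction or update rule used by the EGP model itself.

```python
from collections import deque

def mgg_expander(n):
    """Margulis-Gabber-Galil expander on the vertex set Z_n x Z_n.

    Each vertex (x, y) is linked to (x+2y, y), (x+2y+1, y),
    (x, y+2x), (x, y+2x+1) mod n; edges are made undirected,
    so every node has degree at most 8, independent of n.
    """
    def idx(x, y):
        return (x % n) * n + (y % n)

    adj = {v: set() for v in range(n * n)}
    for x in range(n):
        for y in range(n):
            u = idx(x, y)
            for v in (idx(x + 2 * y, y), idx(x + 2 * y + 1, y),
                      idx(x, y + 2 * x), idx(x, y + 2 * x + 1)):
                if v != u:  # drop self-loops
                    adj[u].add(v)
                    adj[v].add(u)
    return adj

def propagate(adj, feats, steps=1):
    """Mean-aggregation message passing over the expander edges."""
    for _ in range(steps):
        feats = [sum(feats[v] for v in adj[u]) / len(adj[u])
                 for u in sorted(adj)]
    return feats
```

Because the degree is bounded by a constant, one propagation round costs O(|V|) time and space, matching the linear-complexity goal stated above; the expander's logarithmic diameter is what lets information mix globally in few rounds without bottlenecks.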