Deploying graph neural networks (GNNs) on whole-graph classification or regression tasks is known to be challenging: it often requires computing node features that are mindful of both local interactions in their neighbourhood and the global context of the graph structure. GNN architectures that navigate this space need to avoid pathological behaviours, such as bottlenecks and oversquashing, while ideally having linear time and space complexity requirements. In this work, we propose an elegant approach based on propagating information over expander graphs. We provide an efficient method for constructing expander graphs of a given size, and use this insight to propose the EGP model. We show that EGP is able to address all of the above concerns, while requiring minimal effort to set up, and provide evidence of its empirical utility on relevant datasets and baselines in the Open Graph Benchmark. Importantly, using expander graphs as a template for message passing necessarily gives rise to negative curvature. While this appears to be counterintuitive in light of recent related work on oversquashing, we theoretically demonstrate that negatively curved edges are likely to be required to obtain scalable message passing without bottlenecks. To the best of our knowledge, this is a previously unstudied result in the context of graph representation learning, and we believe our analysis paves the way to a novel class of scalable methods to counter oversquashing in GNNs.
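The abstract mentions an efficient method for constructing expander graphs of a given size. As a hedged illustration (not necessarily the paper's exact construction), one classical family of expanders is the Cayley graphs of the special linear group SL(2, Z_n) under two standard generators and their inverses; the sketch below builds such a graph with the Python standard library only, with all function names being hypothetical:

```python
from collections import Counter


def cayley_graph_sl2(n):
    """Sketch: Cayley graph of SL(2, Z_n) with generators
    S1 = [[1,1],[0,1]], S2 = [[1,0],[1,1]] and their inverses.
    Such Cayley graphs are a classical family of 4-regular expanders."""
    gens = [
        ((1, 1), (0, 1)),          # S1
        ((1, 0), (1, 1)),          # S2
        ((1, n - 1), (0, 1)),      # S1^{-1} (mod n)
        ((1, 0), (n - 1, 1)),      # S2^{-1} (mod n)
    ]

    def mat_mul(a, b):
        # 2x2 matrix product over Z_n
        return (
            ((a[0][0] * b[0][0] + a[0][1] * b[1][0]) % n,
             (a[0][0] * b[0][1] + a[0][1] * b[1][1]) % n),
            ((a[1][0] * b[0][0] + a[1][1] * b[1][0]) % n,
             (a[1][0] * b[0][1] + a[1][1] * b[1][1]) % n),
        )

    # BFS from the identity: enumerates the whole group (the generators
    # generate SL(2, Z_n)) and records undirected Cayley edges.
    identity = ((1, 0), (0, 1))
    nodes, frontier, edges = {identity}, [identity], set()
    while frontier:
        nxt = []
        for v in frontier:
            for g in gens:
                w = mat_mul(v, g)
                edges.add(frozenset((v, w)))
                if w not in nodes:
                    nodes.add(w)
                    nxt.append(w)
        frontier = nxt
    return nodes, edges


nodes, edges = cayley_graph_sl2(5)
degree = Counter()
for e in edges:
    a, b = tuple(e)
    degree[a] += 1
    degree[b] += 1
```

For `n = 5` this enumerates the 120 elements of SL(2, Z_5) as a connected 4-regular graph; varying `n` gives expanders of different sizes, which is the kind of size-controllable template the abstract alludes to for message passing.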