Graph neural networks (GNNs) are a popular class of parametric model for learning over graph-structured data. Recent work has argued that GNNs primarily use the graph for feature smoothing, and has shown competitive results on benchmark tasks by simply operating on graph-smoothed node features, rather than using end-to-end learned feature hierarchies that are challenging to scale to large graphs. In this work, we ask whether these results can be extended to heterogeneous graphs, which encode multiple types of relationships between different entities. We propose Neighbor Averaging over Relation Subgraphs (NARS), which trains a classifier on neighbor-averaged features for randomly sampled subgraphs of the "metagraph" of relations. We describe optimizations that allow these sets of node features to be computed in a memory-efficient way, both at training and inference time. NARS achieves a new state-of-the-art accuracy on several benchmark datasets, outperforming more expensive GNN-based methods.
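The core idea of neighbor averaging over a sampled relation subgraph can be sketched as follows. This is a minimal illustrative toy, not the paper's implementation: the relation names, dense adjacency matrices, and two-hop smoothing are all assumptions for the example (a real system would use sparse operations and the optimizations described in the paper).

```python
import numpy as np

# Hypothetical toy heterogeneous graph: 3 relation types over 5 nodes,
# each relation stored as a dense adjacency matrix (illustration only).
rng = np.random.default_rng(0)
num_nodes, num_feats = 5, 4
relations = {
    r: (rng.random((num_nodes, num_nodes)) < 0.4).astype(float)
    for r in ["cites", "writes", "affiliated_with"]
}
features = rng.random((num_nodes, num_feats))

def neighbor_average(feats, subset, relations, hops=2):
    """Smooth features over the union of the sampled relations' edges."""
    adj = sum(relations[r] for r in subset)
    adj = adj + np.eye(adj.shape[0])       # add self-loops so isolated nodes keep their features
    adj = adj / adj.sum(axis=1, keepdims=True)  # row-normalize: mean over neighbors
    for _ in range(hops):                  # k-hop smoothing = k averaging steps
        feats = adj @ feats
    return feats

# Randomly sample a relation subgraph and precompute smoothed node features;
# a downstream classifier would then be trained on these fixed features.
subset = rng.choice(list(relations), size=2, replace=False)
smoothed = neighbor_average(features, subset, relations)
print(smoothed.shape)  # same shape as the input features
```

Because the smoothed features are computed once up front rather than inside the training loop, the downstream classifier can be a cheap model (e.g. an MLP), which is what makes this approach scalable relative to end-to-end GNN training.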