Heterogeneous graph neural networks (HGNNs) deliver a powerful capability to embed the rich structural and semantic information of a heterogeneous graph into low-dimensional node representations. Existing HGNNs usually learn to embed this information using hierarchical attention mechanisms and repeated neighbor aggregation, suffering from unnecessary complexity and redundant computation. This paper proposes the Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN), which reduces this excess complexity by avoiding overused node-level attention within the same relation and by pre-computing the neighbor aggregation in the pre-processing stage. Unlike previous work, SeHGNN utilizes a light-weight, parameter-free neighbor aggregator to learn structural information for each metapath, and a transformer-based semantic aggregator to combine semantic information across metapaths into the final embedding of each node. As a result, SeHGNN offers a simple network structure, high prediction accuracy, and fast training speed. Extensive experiments on five real-world heterogeneous graphs demonstrate the superiority of SeHGNN over the state-of-the-art methods in both accuracy and training speed. Code is available at https://github.com/ICT-GIMLab/SeHGNN.
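For a concrete picture of the two stages described above, the following is a minimal PyTorch sketch, not the released implementation: a parameter-free neighbor aggregation over metapaths that can be pre-computed once before training, followed by a transformer-based semantic aggregator that fuses the per-metapath embeddings of each node. All names (`precompute_metapath_features`, `SemanticAggregator`), shapes, and the choice of `nn.TransformerEncoderLayer` are illustrative assumptions; see the linked repository for the actual code.

```python
# Illustrative sketch of SeHGNN's two stages (assumptions noted above),
# assuming row-normalized sparse adjacency matrices per edge type.
import torch
import torch.nn as nn


def precompute_metapath_features(feat_dict, metapaths, adj_dict):
    """Parameter-free neighbor aggregation, done once in pre-processing.

    feat_dict:  {node_type: raw feature matrix [N_type, d_type]}
    metapaths:  list of metapaths, each a list of (dst_type, src_type) edges,
                e.g. [("paper", "author"), ("author", "paper")] for PAP
    adj_dict:   {(dst_type, src_type): row-normalized sparse adjacency}
    Returns one aggregated feature matrix per metapath for the target type.
    """
    outputs = []
    for mp in metapaths:
        # start from the raw features of the metapath's terminal node type
        x = feat_dict[mp[-1][1]]
        # chain mean-aggregation matrix products along the metapath (no parameters)
        for dst_type, src_type in reversed(mp):
            x = torch.sparse.mm(adj_dict[(dst_type, src_type)], x)
        outputs.append(x)
    return outputs  # list of [N_target, d] tensors, one per metapath


class SemanticAggregator(nn.Module):
    """Transformer-based fusion of per-metapath embeddings for each node."""

    def __init__(self, num_metapaths, in_dims, hidden_dim, num_classes, num_heads=4):
        super().__init__()
        # project each metapath's features to a common hidden size
        self.projections = nn.ModuleList(
            [nn.Linear(d, hidden_dim) for d in in_dims]
        )
        self.encoder = nn.TransformerEncoderLayer(
            d_model=hidden_dim, nhead=num_heads, batch_first=True
        )
        self.classifier = nn.Linear(num_metapaths * hidden_dim, num_classes)

    def forward(self, metapath_feats):
        # stack metapath embeddings as a length-K "token" sequence per node
        tokens = torch.stack(
            [proj(x) for proj, x in zip(self.projections, metapath_feats)], dim=1
        )                                # [N, K, hidden_dim]
        fused = self.encoder(tokens)     # self-attention across metapaths
        return self.classifier(fused.flatten(1))
```

Because the neighbor aggregation is parameter-free, it runs only once per graph rather than once per training epoch, which is where the training-speed advantage claimed above comes from.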