Researchers have recently proposed a variety of heterogeneous graph neural networks (HGNNs) owing to the ubiquity of heterogeneous graphs in both academia and industry. Instead of pursuing a more powerful HGNN model, in this paper we are interested in devising a versatile plug-and-play module that distills relational knowledge from pre-trained HGNNs. To the best of our knowledge, we are the first to propose a HIgh-order RElational (HIRE) knowledge distillation framework on heterogeneous graphs, which can significantly boost prediction performance regardless of the underlying HGNN architecture. Concretely, our HIRE framework first performs first-order node-level knowledge distillation, which encodes the semantics of the teacher HGNN through its prediction logits. In addition, second-order relation-level knowledge distillation imitates the relational correlations between node embeddings of different types generated by the teacher HGNN. Extensive experiments on various popular HGNN models and three real-world heterogeneous graphs demonstrate that our method yields consistent and considerable performance gains, confirming its effectiveness and generalization ability.
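The two distillation terms described above can be illustrated with a minimal sketch. This is not the authors' implementation: all function names are hypothetical, the node-level term is the standard softened-logit KL divergence, and cosine-similarity matrices are assumed as one plausible realization of the cross-type relational correlation.

```python
import numpy as np

def softmax(x, T=1.0):
    # Temperature-scaled, numerically stable softmax over the last axis.
    z = np.exp((x - x.max(axis=-1, keepdims=True)) / T)
    return z / z.sum(axis=-1, keepdims=True)

def node_level_kd(teacher_logits, student_logits, T=2.0):
    """First-order KD: KL divergence between softened teacher and student
    class distributions, so the student mimics the teacher's soft labels."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return float(np.mean(kl) * T * T)  # T^2 rescaling, as in classic logit KD

def relation_level_kd(t_emb_a, t_emb_b, s_emb_a, s_emb_b):
    """Second-order KD (sketch): match the cross-type correlation matrices
    between embeddings of two node types, teacher vs. student."""
    def corr(a, b):
        a = a / (np.linalg.norm(a, axis=1, keepdims=True) + 1e-12)
        b = b / (np.linalg.norm(b, axis=1, keepdims=True) + 1e-12)
        return a @ b.T  # pairwise cosine similarities across the two types
    return float(np.mean((corr(t_emb_a, t_emb_b) - corr(s_emb_a, s_emb_b)) ** 2))
```

In a training loop, both terms would be added (with weighting coefficients) to the student's ordinary task loss; the exact weighting scheme is left open here.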