Factorisation-based Models (FMs), such as DistMult, have enjoyed enduring success for Knowledge Graph Completion (KGC) tasks, often outperforming Graph Neural Networks (GNNs). However, unlike GNNs, FMs struggle to incorporate node features and to generalise to unseen nodes in inductive settings. Our work bridges the gap between FMs and GNNs by proposing ReFactorGNNs. This new architecture draws upon both modelling paradigms, which were previously thought of as largely disjoint. Concretely, using a message-passing formalism, we show how FMs can be cast as GNNs by reformulating the gradient-descent procedure as message-passing operations; this reformulation forms the basis of our ReFactorGNNs. Across a multitude of well-established KGC benchmarks, ReFactorGNNs achieve transductive performance comparable to FMs and state-of-the-art inductive performance, while using an order of magnitude fewer parameters.
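To make the gradient-as-message view concrete, the following is a minimal PyTorch sketch, not the paper's implementation: it shows that one SGD step on DistMult's triple score updates a node embedding by a relation-modulated term coming from its neighbour, i.e. exactly the shape of a message in a message-passing layer. The tensor names and the logistic loss are illustrative assumptions.

```python
# Sketch: one gradient step of DistMult viewed as message passing.
import torch
import torch.nn.functional as F

dim, num_nodes, num_rels = 8, 4, 2
node_emb = torch.randn(num_nodes, dim, requires_grad=True)  # entity embeddings
rel_emb = torch.randn(num_rels, dim, requires_grad=True)    # relation embeddings

def distmult_score(s, p, o):
    # DistMult: <e_s, r_p, e_o> = sum_k e_s[k] * r_p[k] * e_o[k]
    return (node_emb[s] * rel_emb[p] * node_emb[o]).sum()

# One SGD step on a single observed triple (s, p, o) with a logistic loss.
s, p, o = 0, 1, 2
loss = -F.logsigmoid(distmult_score(s, p, o))
loss.backward()

# The gradient w.r.t. e_s is a scalar gate times r_p ⊙ e_o: a "message"
# sent from neighbour o to s along the edge labelled p. Aggregating such
# gradients over all incident edges is a message-passing update, which is
# the correspondence ReFactorGNNs build on.
with torch.no_grad():
    gate = torch.sigmoid(-distmult_score(s, p, o))       # d(loss)/d(score)
    message_to_s = -gate * rel_emb[p] * node_emb[o]
    print(torch.allclose(node_emb.grad[s], message_to_s))  # True
```

Running the script prints `True`: the autograd-computed update to the subject embedding coincides with the hand-written neighbour message, illustrating why FM training can be re-read as a GNN forward pass over the graph.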