Knowledge graph (KG) link prediction aims to infer new facts from existing facts in the KG. Recent studies have shown that using the graph neighborhood of a node via graph neural networks (GNNs) provides more useful information than using the query alone. Conventional GNNs for KG link prediction follow the standard message-passing paradigm on the entire KG, which leads to over-smoothing of representations and also limits their scalability. At scale, it becomes computationally expensive to aggregate useful information from the entire KG for inference. To address the limitations of existing KG link prediction frameworks, we propose a novel retrieve-and-read framework, which first retrieves a relevant subgraph context for the query and then jointly reasons over the context and the query with a high-capacity reader. As an exemplar instantiation of the new framework, we propose a novel Transformer-based GNN as the reader, which incorporates a graph-based attention structure and cross-attention between query and context for deep fusion. This design enables the model to focus on context information salient to the query. Empirical results on two standard KG link prediction datasets demonstrate the competitive performance of the proposed method.
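To make the reader design concrete, the snippet below gives a minimal PyTorch sketch of the two attention mechanisms the abstract describes: masked self-attention over retrieved subgraph nodes (the graph-based attention structure) followed by cross-attention that fuses the query with the context. The module name CrossAttentionReader, the embedding size, the head count, and the boolean edge mask are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn


class CrossAttentionReader(nn.Module):
    """Minimal sketch of a reader block that fuses a (head, relation)
    query with a retrieved subgraph context. Hyperparameters are
    assumptions chosen for the example."""

    def __init__(self, dim: int = 128, heads: int = 4):
        super().__init__()
        # Self-attention over context nodes, restricted to graph edges
        self.graph_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        # Cross-attention: the query attends to salient context nodes
        self.cross_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)

    def forward(self, query, context, edge_mask):
        # query:     (B, 1, D) embedding of the link-prediction query
        # context:   (B, N, D) embeddings of retrieved subgraph nodes
        # edge_mask: (B * heads, N, N) bool; True blocks attention, so
        #            each node attends only to its graph neighbors
        ctx, _ = self.graph_attn(context, context, context,
                                 attn_mask=edge_mask)
        ctx = self.norm1(context + ctx)
        fused, _ = self.cross_attn(query, ctx, ctx)
        return self.norm2(query + fused)


# Usage with random tensors (B=2 queries, N=16 retrieved nodes, D=128)
reader = CrossAttentionReader(dim=128, heads=4)
query = torch.randn(2, 1, 128)
context = torch.randn(2, 16, 128)
edge_mask = torch.rand(2 * 4, 16, 16) > 0.5  # placeholder adjacency mask
out = reader(query, context, edge_mask)      # (2, 1, 128) fused query
```

Restricting the self-attention mask to graph edges is one plausible way to realize graph-structured attention inside a Transformer, while the second attention call is what lets the model focus on the context information most relevant to the query; the actual reader would stack several such blocks and add feed-forward sublayers.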