Relation prediction on knowledge graphs (KGs) is a key research topic. Dominant embedding-based methods mainly focus on the transductive setting and lack the inductive ability to generalize to new entities at inference time. Existing methods for inductive reasoning mostly mine the connections between entities, i.e., relational paths, without considering the nature of the head and tail entities contained in the relational context. This paper proposes a novel method that captures both the connections between entities and the intrinsic nature of the entities themselves, by simultaneously aggregating RElational Paths and cOntext with a unified hieRarchical Transformer framework, namely REPORT. REPORT relies solely on relation semantics and can naturally generalize to the fully-inductive setting, where the KGs used for training and inference share no entities. In the experiments, REPORT performs consistently better than all baselines on almost all of the eight version subsets of the two fully-inductive benchmark datasets. Moreover, REPORT is interpretable, providing each element's contribution to the prediction results.
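To make the two-level aggregation idea concrete, the sketch below shows one plausible reading of such a hierarchical Transformer: a lower encoder summarizes each relational path and each entity's relational context from relation embeddings only (no entity embeddings, which is what permits generalization to unseen entities), and an upper encoder fuses these summaries with the query relation to score a candidate triple. This is a minimal illustration, not the authors' implementation; all module names, layer counts, and hyperparameters are assumptions.

```python
# Minimal sketch of hierarchical path/context aggregation (illustrative, not the paper's code).
import torch
import torch.nn as nn


class HierarchicalRelationAggregator(nn.Module):
    def __init__(self, num_relations: int, dim: int = 64, heads: int = 4):
        super().__init__()
        # Relation embeddings only -- no entity embeddings, so unseen entities
        # at inference time pose no problem (fully-inductive setting).
        self.rel_emb = nn.Embedding(num_relations, dim)
        self.cls = nn.Parameter(torch.randn(1, 1, dim))
        layer = nn.TransformerEncoderLayer(dim, heads, batch_first=True)
        self.lower = nn.TransformerEncoder(layer, num_layers=2)  # per path / per context
        self.upper = nn.TransformerEncoder(layer, num_layers=2)  # across summaries
        self.scorer = nn.Linear(dim, 1)

    def encode_sequence(self, rel_ids: torch.Tensor) -> torch.Tensor:
        # rel_ids: (batch, seq_len) relation indices of one path or one entity context.
        x = self.rel_emb(rel_ids)
        x = torch.cat([self.cls.expand(x.size(0), -1, -1), x], dim=1)
        return self.lower(x)[:, 0]  # CLS-style summary vector

    def forward(self, query_rel, paths, head_ctx, tail_ctx):
        # query_rel: (batch,) candidate relation; paths: list of (batch, len) tensors.
        summaries = [self.encode_sequence(p) for p in paths]
        summaries += [self.encode_sequence(head_ctx), self.encode_sequence(tail_ctx)]
        q = self.rel_emb(query_rel).unsqueeze(1)
        tokens = torch.cat([q] + [s.unsqueeze(1) for s in summaries], dim=1)
        fused = self.upper(tokens)[:, 0]        # fused query representation
        return self.scorer(fused).squeeze(-1)   # plausibility score of the triple
```

Because both levels attend over relation tokens only, the upper encoder's attention weights over path and context summaries give one natural handle on the kind of element-level interpretability the abstract mentions.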