Recent studies on knowledge graphs (KGs) show that path-based methods empowered by pre-trained language models perform well at providing inductive and explainable relation predictions. In this paper, we introduce the concepts of relation path coverage and relation path confidence to filter out unreliable paths prior to model training and thereby improve model performance. Moreover, we propose the Knowledge Reasoning Sentence Transformer (KRST) to predict inductive relations in KGs. KRST is designed to encode reliable paths extracted from KGs, allowing us to properly cluster paths and provide multi-aspect explanations. We conduct extensive experiments on three real-world datasets. The experimental results show that, compared to SOTA models, KRST achieves the best performance in most transductive and inductive test cases (4 of 6) and in 11 of 12 few-shot test cases.
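The abstract leaves the exact definitions of relation path coverage and relation path confidence to the body of the paper. As a rough illustration only, the Python sketch below implements one common rule-mining-style reading: coverage measures what fraction of a relation's triples a path reaches, and confidence measures how often the path's endpoint pairs actually hold the relation. The function name, data layout, and thresholds are all hypothetical, not the paper's method.

```python
# A minimal sketch of the path-filtering idea, assuming rule-mining-style
# definitions (the paper's exact formulas may differ). `triples[r]` is the
# set of (head, tail) entity pairs labeled with relation r; `paths[r]` maps
# r to candidate paths, each represented by the set of (head, tail) pairs
# it connects. All names here are hypothetical.

from typing import Dict, List, Set, Tuple

Pair = Tuple[str, str]

def filter_reliable_paths(
    triples: Dict[str, Set[Pair]],
    paths: Dict[str, List[Set[Pair]]],
    min_coverage: float = 0.1,
    min_confidence: float = 0.5,
) -> Dict[str, List[Set[Pair]]]:
    """Keep only paths whose coverage and confidence exceed the thresholds."""
    reliable: Dict[str, List[Set[Pair]]] = {}
    for r, candidate_paths in paths.items():
        positives = triples.get(r, set())
        kept: List[Set[Pair]] = []
        for pair_set in candidate_paths:
            if not pair_set or not positives:
                continue
            hits = len(pair_set & positives)
            coverage = hits / len(positives)    # fraction of r's triples the path reaches
            confidence = hits / len(pair_set)   # fraction of path endpoints that hold r
            if coverage >= min_coverage and confidence >= min_confidence:
                kept.append(pair_set)
        reliable[r] = kept
    return reliable
```

Under this reading, paths passing both thresholds would be the "reliable" ones fed to KRST for training; the actual thresholds and scoring used in the paper may be different.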