Existing work on augmenting question answering (QA) models with external knowledge (e.g., knowledge graphs) either struggles to model multi-hop relations efficiently or lacks transparency into the model's prediction rationale. In this paper, we propose a novel knowledge-aware approach that equips pre-trained language models (PTLMs) with a multi-hop relational reasoning module, named multi-hop graph relation network (MHGRN). It performs multi-hop, multi-relational reasoning over subgraphs extracted from external knowledge graphs. The proposed reasoning module unifies path-based reasoning methods and graph neural networks to achieve better interpretability and scalability. We also empirically demonstrate its effectiveness and scalability on the CommonsenseQA and OpenBookQA datasets, and interpret its behavior with case studies.
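To make the abstract's description concrete, the following is a minimal sketch of K-hop, multi-relational message passing over an extracted KG subgraph. All names here are hypothetical illustrations, and the uniform pooling over hop counts stands in for the learned hop- and relation-level attention described in the paper; this is not the authors' implementation.

```python
import numpy as np


def multi_hop_relational_pass(X, adj, W, K):
    """Sketch of K-hop multi-relational propagation.

    X:   (n, d) node features for the n subgraph entities
    adj: (R, n, n) one adjacency matrix per relation type
    W:   (R, d, d) one transform per relation type
    K:   maximum number of reasoning hops
    """
    h = X
    hops = [h]
    for _ in range(K):
        # Aggregate messages across all R relation types,
        # each with its own relation-specific transform.
        h = sum(adj[r] @ h @ W[r] for r in range(len(adj)))
        h = np.maximum(h, 0.0)  # ReLU nonlinearity
        hops.append(h)
    # Uniform pooling over hop counts; MHGRN instead learns
    # attention weights over hops and relation sequences.
    return np.mean(hops, axis=0)


# Toy usage: 4 entities, 2 relation types, 16-dim features, 2 hops.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 16))
adj = rng.integers(0, 2, size=(2, 4, 4)).astype(float)
W = rng.normal(scale=0.1, size=(2, 16, 16))
out = multi_hop_relational_pass(X, adj, W, K=2)  # shape (4, 16)
```

Because each hop composes relation-specific propagations, a K-hop pass implicitly scores relation paths of length up to K, which is what lets this style of module unify path-based reasoning with GNN-style aggregation.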