With the rapid growth of knowledge bases (KBs), question answering over knowledge bases (KBQA) has drawn considerable attention in recent years. Most existing KBQA methods follow the so-called encode-compare framework: they map the question and the KB facts into a common embedding space, in which the similarity between the question vector and the fact vectors can be conveniently computed. This, however, inevitably loses the original word-level interaction information. To preserve more of this information, we propose an attentive recurrent neural network with a similarity-matrix-based convolutional neural network (AR-SMCNN), a model that captures comprehensive hierarchical information by exploiting the complementary strengths of RNNs and CNNs. We use an RNN to capture semantic-level correlation through its sequential modeling nature, with an attention mechanism that keeps track of entities and relations simultaneously. Meanwhile, we use a similarity-matrix-based CNN with two-direction pooling to extract literal-level word interaction matching, exploiting the CNN's strength at modeling spatial correlation among data. Moreover, we develop a new heuristic extension method for entity detection that significantly reduces the effect of noise. Our method outperforms the state of the art on the SimpleQuestions benchmark in both accuracy and efficiency.
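The core of the similarity-matrix matching step can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's exact architecture: it computes a cosine similarity matrix between question and fact word embeddings and applies two-direction (row-wise and column-wise) max pooling, omitting the convolutional layer that AR-SMCNN applies to the matrix.

```python
import numpy as np

def similarity_matrix_pooling(Q, F):
    """Hypothetical sketch of similarity-matrix matching with two-direction pooling.

    Q: (m, d) array of question word embeddings
    F: (n, d) array of fact word embeddings
    Returns row-wise and column-wise max-pooled similarity features.
    """
    # L2-normalize rows so the dot product gives cosine similarity
    Qn = Q / (np.linalg.norm(Q, axis=1, keepdims=True) + 1e-8)
    Fn = F / (np.linalg.norm(F, axis=1, keepdims=True) + 1e-8)
    S = Qn @ Fn.T  # (m, n) word-by-word similarity matrix

    # Two-direction pooling: best fact match per question word,
    # and best question match per fact word
    row_pool = S.max(axis=1)  # (m,)
    col_pool = S.max(axis=0)  # (n,)
    return row_pool, col_pool

# Toy usage with random embeddings (dimensions chosen arbitrarily)
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))   # 4 question words, 8-dim embeddings
F = rng.normal(size=(6, 8))   # 6 fact words
row_feats, col_feats = similarity_matrix_pooling(Q, F)
print(row_feats.shape, col_feats.shape)  # (4,) (6,)
```

The pooled features from both directions summarize how well each side is covered by the other, which is the literal-level matching signal the model combines with the RNN's semantic-level score.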