Query graph construction aims to build a correct, executable SPARQL query over the KG to answer natural language questions. Although recent methods have achieved good results using neural network-based query graph ranking, they face three new challenges when handling more complex questions: (1) complicated SPARQL syntax, (2) a huge search space, and (3) locally ambiguous query graphs. In this paper, we provide a new solution. As preparation, we extend the query graph by treating each SPARQL clause as a subgraph consisting of vertices and edges, and define a unified graph grammar called AQG to describe the structure of query graphs. Based on these concepts, we propose a novel end-to-end model that performs hierarchical autoregressive decoding to generate query graphs. The high-level decoding generates an AQG as a constraint to prune the search space and reduce local ambiguity. The low-level decoding completes the query graph construction by selecting appropriate instances from the pre-prepared candidates to fill the slots in the AQG. Experimental results show that our method greatly improves the SOTA performance on complex KGQA benchmarks. Equipped with pre-trained models, the performance of our method is further improved, achieving SOTA on all three datasets used.
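To make the "outline then fill" idea concrete, the following is a minimal, hypothetical sketch of the two decoding stages: a high-level step that emits an abstract query graph (AQG) of typed slots, and a low-level step that fills each slot from pre-prepared candidates. All names (`AQG`, `high_level_decode`, `low_level_decode`) and the toy scoring rule are illustrative assumptions, not the paper's actual implementation.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class AQG:
    """Abstract query graph: typed vertex/edge slots without concrete KG instances."""
    vertex_slots: List[str] = field(default_factory=list)  # e.g. ["entity", "variable"]
    edge_slots: List[str] = field(default_factory=list)    # e.g. ["relation"]


def high_level_decode(question_encoding) -> AQG:
    """High-level decoding: autoregressively emit the AQG structure (the outline).
    Here we return a fixed toy outline; a real model would predict one vertex or
    edge addition per step, conditioned on the question encoding."""
    return AQG(vertex_slots=["entity", "variable"], edge_slots=["relation"])


def low_level_decode(aqg: AQG, candidates: Dict[str, List[str]]) -> Dict[str, str]:
    """Low-level decoding: fill each AQG slot with an instance drawn from the
    pre-prepared candidate set of the matching type. The AQG constrains which
    candidates are admissible for which slot, which is how the outline prunes
    the search space. Toy scoring: take the first candidate of each type."""
    filled = {}
    for i, slot_type in enumerate(aqg.vertex_slots + aqg.edge_slots):
        filled[f"slot_{i}:{slot_type}"] = candidates[slot_type][0]
    return filled


# Toy usage with hypothetical candidates for a one-hop question.
candidates = {
    "entity": ["dbr:Barack_Obama"],
    "variable": ["?x"],
    "relation": ["dbo:spouse"],
}
aqg = high_level_decode(question_encoding=None)
print(low_level_decode(aqg, candidates))
```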