The dominant paradigm for semantic parsing in recent years is to formulate parsing as a sequence-to-sequence task, generating predictions with auto-regressive sequence decoders. In this work, we explore an alternative paradigm. We formulate semantic parsing as a dependency parsing task, applying graph-based decoding techniques developed for syntactic parsing. We compare various decoding techniques given the same pre-trained Transformer encoder on the TOP dataset, including settings where training data is limited or contains only partially-annotated examples. We find that our graph-based approach is competitive with sequence decoders in the standard setting, and offers significant improvements in data efficiency and in settings where partially-annotated data is available.
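To make the graph-based formulation concrete, the sketch below shows one common way such decoders are built on top of a Transformer encoder: a biaffine scorer over all (head, dependent) token pairs, followed by head selection over the resulting score matrix. The module names, dimensions, and the greedy decoding step are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal, illustrative sketch of graph-based dependency decoding over a
# Transformer encoder (assumed setup; not the paper's exact architecture).
import torch
import torch.nn as nn


class BiaffineArcScorer(nn.Module):
    """Scores every (head, dependent) token pair with a biaffine function."""

    def __init__(self, enc_dim: int, arc_dim: int = 256):
        super().__init__()
        self.head_mlp = nn.Sequential(nn.Linear(enc_dim, arc_dim), nn.ReLU())
        self.dep_mlp = nn.Sequential(nn.Linear(enc_dim, arc_dim), nn.ReLU())
        # Biaffine weight; the extra +1 column adds a bias term for heads.
        self.W = nn.Parameter(torch.empty(arc_dim + 1, arc_dim))
        nn.init.xavier_uniform_(self.W)

    def forward(self, enc: torch.Tensor) -> torch.Tensor:
        # enc: [batch, seq_len, enc_dim] contextual encoder states.
        h = self.head_mlp(enc)                      # [B, T, A]
        d = self.dep_mlp(enc)                       # [B, T, A]
        ones = torch.ones(*h.shape[:2], 1, device=h.device)
        h = torch.cat([h, ones], dim=-1)            # [B, T, A+1]
        # scores[b, i, j] = score of token i being the head of token j.
        return h @ self.W @ d.transpose(1, 2)       # [B, T, T]


def greedy_decode(scores: torch.Tensor) -> torch.Tensor:
    """Pick the highest-scoring head for each token (argmax over heads).

    A full graph-based decoder would instead run a maximum-spanning-tree
    algorithm (e.g. Chu-Liu/Edmonds) over these scores to guarantee a tree.
    """
    return scores.argmax(dim=1)                     # [B, T] head index per token


if __name__ == "__main__":
    batch, seq_len, enc_dim = 2, 7, 768             # e.g. a BERT-base hidden size
    enc_states = torch.randn(batch, seq_len, enc_dim)
    scorer = BiaffineArcScorer(enc_dim)
    arc_scores = scorer(enc_states)
    heads = greedy_decode(arc_scores)
    print(heads.shape)                              # torch.Size([2, 7])
```

Because every arc receives a score independently, this style of decoder avoids auto-regressive generation, which is one reason it can behave differently from sequence decoders in low-data and partially-annotated settings.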