Task-oriented parsing (TOP) aims to convert natural language into machine-readable representations of specific tasks, such as setting an alarm. A popular approach to TOP is to apply seq2seq models to generate linearized parse trees. A more recent line of work argues that pretrained seq2seq models are better at generating outputs that are themselves natural language, so they replace linearized parse trees with canonical natural-language paraphrases that can then be easily translated into parse trees, resulting in so-called naturalized parsers. In this work we continue to explore naturalized semantic parsing by presenting a general reduction of TOP to abstractive question answering that overcomes some limitations of canonical paraphrasing. Experimental results show that our QA-based technique outperforms state-of-the-art methods in full-data settings while achieving dramatic improvements in few-shot settings.