Although neural models have achieved competitive results in dialogue systems, they show limited ability to represent core semantics, for example ignoring important entities. To address this, we exploit Abstract Meaning Representation (AMR) to help dialogue modeling. Compared with the textual input, AMR explicitly provides core semantic knowledge and reduces data sparsity. We develop an algorithm to construct dialogue-level AMR graphs from sentence-level AMRs and explore two ways to incorporate AMRs into dialogue systems. Experimental results on both dialogue understanding and response generation tasks show the superiority of our model. To our knowledge, we are the first to incorporate a formal semantic representation into neural dialogue modeling.
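The dialogue-level graph construction mentioned above can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the function name `merge_dialogue_amrs`, the dummy `dialog` root, and the speaker-tagged edges are all assumptions made for the example.

```python
# Hypothetical sketch: merging sentence-level AMR graphs into a single
# dialogue-level graph via a dummy root node. Names and edge labels
# ("dialog", "speakerN") are illustrative assumptions.

def merge_dialogue_amrs(sentence_graphs, speakers):
    """Connect per-utterance AMR graphs under a shared dummy root.

    sentence_graphs: list of (nodes, edges) pairs, where nodes is a
        dict {node_id: concept} (first entry is the utterance root)
        and edges is a list of (src_id, relation, tgt_id) triples
        with graph-local ids.
    speakers: one speaker label per utterance.
    Returns one (nodes, edges) pair with globally unique node ids.
    """
    nodes = {"d0": "dialog"}  # dummy root spanning the whole dialogue
    edges = []
    for i, ((sent_nodes, sent_edges), spk) in enumerate(
            zip(sentence_graphs, speakers)):
        prefix = f"u{i}_"
        # Re-identify nodes so ids stay unique across utterances.
        for nid, concept in sent_nodes.items():
            nodes[prefix + nid] = concept
        for src, rel, tgt in sent_edges:
            edges.append((prefix + src, rel, prefix + tgt))
        # Link each utterance root to the dialogue root, tagged by speaker.
        root = prefix + next(iter(sent_nodes))
        edges.append(("d0", f"speaker{spk}", root))
    return nodes, edges
```

A toy usage: merging the AMRs for "the boy wants..." and a second utterance yields one graph whose `d0` root fans out to each utterance root with a speaker-labeled edge, while the original intra-sentence relations are preserved.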