Conversation generation is a challenging task in Natural Language Generation (NLG) that has attracted increasing attention in recent years. A number of recent works adopt sequence-to-sequence architectures augmented with external knowledge, which successfully improves the quality of generated conversations. Nevertheless, few works exploit knowledge extracted from similar conversations for utterance generation. Taking conversations in the customer-service and court-debate domains as examples, essential entities and phrases, as well as their associated logic and inter-relationships, can be extracted and borrowed from similar conversation instances. Such information can provide useful signals for improving conversation generation. In this paper, we propose a novel reading-and-memory framework, the Deep Reading Memory Network (DRMN), which remembers useful information from similar conversations to improve utterance generation. We apply our model to two large-scale conversation datasets from the justice and e-commerce domains. Experiments show that the proposed model outperforms state-of-the-art approaches.
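The core idea of reading a memory of similar conversations can be illustrated with a minimal attention-based memory read. This is a generic sketch, not the actual DRMN architecture: it assumes the current dialogue state and the similar conversations have already been encoded as fixed-size vectors, and shows how an attention-weighted read would combine them.

```python
import math

def attention_read(query, memory):
    """Attention-weighted read over a memory of encoded similar conversations.

    query:  vector (list of floats) encoding the current dialogue state.
    memory: list of vectors, one per encoded similar conversation.
    Returns the softmax-attention-weighted sum of the memory vectors.
    """
    # Dot-product relevance score between the query and each memory slot.
    scores = [sum(q * m for q, m in zip(query, vec)) for vec in memory]
    # Numerically stable softmax over the scores.
    max_s = max(scores)
    exps = [math.exp(s - max_s) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Weighted sum of memory vectors: the "read" result fed to the generator.
    dim = len(query)
    return [sum(w * vec[i] for w, vec in zip(weights, memory)) for i in range(dim)]

# The query is closer to the first memory slot, so that slot dominates the read.
read = attention_read([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]])
```

In a full model such a read vector would be concatenated with (or attended to by) the decoder state at each generation step, letting the generator borrow entities and phrasing from the retrieved similar conversations.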