Neural network models often struggle to incorporate commonsense knowledge into open-domain dialogue systems. In this paper, we propose a novel knowledge-aware dialogue generation model (called TransDG), which transfers question representation and knowledge matching abilities from the knowledge base question answering (KBQA) task to facilitate utterance understanding and factual knowledge selection for dialogue generation. In addition, we propose a response guiding attention mechanism and a multi-step decoding strategy to steer our model toward the features that are most relevant for response generation. Experiments on two benchmark datasets demonstrate that our model consistently outperforms the compared methods in generating informative and fluent dialogues. Our code is available at https://github.com/siat-nlp/TransDG.
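To make the multi-step decoding idea concrete, the following is a minimal sketch, not the authors' implementation: a first-pass decoder drafts a response from an encoded post summary, and a second-pass decoder attends over the draft states when producing the final response. All module names, dimensions, and the attention wiring here are illustrative assumptions.

```python
# Hypothetical two-pass decoder sketch; sizes and structure are assumptions,
# not TransDG's actual architecture.
import torch
import torch.nn as nn

class TwoStepDecoder(nn.Module):
    def __init__(self, vocab_size=1000, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # first pass: draft a response conditioned on the post summary
        self.first = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # second pass: re-read the draft states via attention
        self.attn = nn.MultiheadAttention(hid_dim, num_heads=4, batch_first=True)
        self.second = nn.GRU(emb_dim + hid_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, post_state, draft_tokens, final_tokens):
        # post_state: (1, batch, hid_dim) summary of the encoded post
        draft_emb = self.embed(draft_tokens)                  # (B, T1, E)
        draft_states, _ = self.first(draft_emb, post_state)   # (B, T1, H)

        final_emb = self.embed(final_tokens)                  # (B, T2, E)
        draft_summary = draft_states.mean(dim=1, keepdim=True)
        draft_summary = draft_summary.expand(-1, final_emb.size(1), -1)
        query, _ = self.second(
            torch.cat([final_emb, draft_summary], dim=-1), post_state)
        # each second-pass step attends over the first-pass draft states
        ctx, _ = self.attn(query, draft_states, draft_states)
        return self.out(query + ctx)                          # (B, T2, V)

# toy usage with random tokens and a fake encoder summary
dec = TwoStepDecoder()
post = torch.zeros(1, 2, 128)
draft = torch.randint(0, 1000, (2, 5))
final = torch.randint(0, 1000, (2, 6))
print(dec(post, draft, final).shape)  # torch.Size([2, 6, 1000])
```

The point of the second pass is that the final response is generated with access to a full draft, which is one way to realize "focusing on relevant features" during decoding.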