Grounding dialogue on external knowledge and interpreting linguistic patterns in the dialogue history context, such as ellipsis, anaphora, and co-reference, are critical for dialogue comprehension and generation. In this paper, we present a novel open-domain dialogue generation model that effectively utilizes large-scale commonsense and named entity-based knowledge in addition to the unstructured, topic-specific knowledge associated with each utterance. We enhance the commonsense knowledge with named entity-aware structures using co-references. Our proposed model utilizes a multi-hop attention layer to preserve the most accurate and critical parts of the dialogue history and the associated knowledge. In addition, we employ a Commonsense and Named Entity Enhanced Attention Module, which starts with the triples extracted from various sources and gradually finds the relevant supporting set of triples using multi-hop attention with the query vector obtained from the interactive dialogue-knowledge module. Empirical results on two benchmark datasets demonstrate that our model significantly outperforms state-of-the-art methods in terms of both automatic evaluation metrics and human judgment. Our code is publicly available at \href{https://github.com/deekshaVarshney/CNTF}{https://github.com/deekshaVarshney/CNTF}; \href{https://www.iitp.ac.in/~ai-nlp-ml/resources/codes/CNTF.zip}{https://www.iitp.ac.in/~ai-nlp-ml/resources/codes/CNTF.zip}.
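As a minimal illustrative sketch only (not the paper's exact formulation; the hop count $K$, triple embeddings $t_j$, and initial query $q^{(0)}$ are notational assumptions introduced here), multi-hop attention over extracted triples can be viewed as iteratively refining the query with the attended knowledge summary:
\begin{equation*}
\alpha^{(k)}_j = \operatorname{softmax}_j\!\big(q^{(k)\top} t_j\big), \qquad q^{(k+1)} = q^{(k)} + \sum_j \alpha^{(k)}_j\, t_j, \qquad k = 0, \dots, K-1,
\end{equation*}
where $q^{(0)}$ denotes the query vector obtained from the interactive dialogue-knowledge module, and the attention weights at the final hop indicate the supporting set of triples.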