To diversify and enrich generated dialogue responses, knowledge-grounded dialogue has been investigated in recent years. Existing methods tackle the knowledge-grounding challenge by retrieving relevant sentences from a large corpus and augmenting the dialogues with this explicit extra information. Despite their success, however, these approaches suffer from poor inference efficiency. This paper proposes KnowExpert, an end-to-end framework that bypasses the explicit retrieval process by injecting knowledge into pre-trained language models with lightweight adapters, which are then adapted to the knowledge-grounded dialogue task. To the best of our knowledge, this is the first attempt to tackle this challenge without retrieval under an open-domain chit-chat scenario. Experimental results show that KnowExpert performs comparably with some retrieval-based baselines while being time-efficient at inference, demonstrating the potential of our proposed direction.
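The abstract mentions injecting knowledge into pre-trained language models via lightweight adapters. The sketch below is only an illustration of the general bottleneck-adapter idea, not the paper's actual implementation; the class name, dimensions, and placement are assumptions for exposition.

```python
import torch
import torch.nn as nn


class KnowledgeAdapter(nn.Module):
    """Illustrative bottleneck adapter: a small residual MLP that can be
    inserted into a frozen pre-trained transformer layer. Only these few
    parameters are trained, so knowledge can be injected without updating
    the backbone (a common lightweight-adapter pattern, assumed here)."""

    def __init__(self, hidden_dim: int = 768, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)  # project down
        self.up = nn.Linear(bottleneck_dim, hidden_dim)    # project back up
        self.act = nn.ReLU()

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # Residual connection preserves the backbone's representation;
        # the adapter only adds a small learned correction.
        return hidden_states + self.up(self.act(self.down(hidden_states)))


# Usage sketch: apply the adapter to the hidden states of a transformer block.
x = torch.randn(2, 16, 768)   # (batch, sequence, hidden)
adapter = KnowledgeAdapter()
print(adapter(x).shape)       # torch.Size([2, 16, 768])
```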