To diversify and enrich generated dialogue responses, knowledge-grounded dialogue has been investigated in recent years. Existing methods tackle the knowledge-grounding challenge by retrieving relevant sentences from a large corpus and augmenting the dialogue with this explicit extra information. Despite their success, however, these approaches suffer from low inference efficiency. This paper proposes KnowExpert, a framework that bypasses the explicit retrieval process by injecting knowledge into pre-trained language models through lightweight adapters, and then adapts the models to the knowledge-grounded dialogue task. To the best of our knowledge, this is the first attempt to tackle this challenge without retrieval in an open-domain chit-chat scenario. Experimental results show that KnowExpert performs comparably with some retrieval-based baselines while being far more time-efficient at inference, demonstrating the effectiveness of our proposed method.
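The abstract does not specify the adapter architecture, but "lightweight adapters" in pre-trained language models conventionally refers to small bottleneck layers inserted between frozen transformer sublayers. As a minimal sketch, assuming a standard bottleneck design (down-projection, nonlinearity, up-projection, residual connection) with hypothetical dimensions, the knowledge-injection module could look like:

```python
import numpy as np

def adapter(h, W_down, W_up):
    """Bottleneck adapter over hidden states h of shape (seq_len, d_model).

    W_down: (d_model, d_bottleneck) down-projection, d_bottleneck << d_model.
    W_up:   (d_bottleneck, d_model) up-projection.
    Only these small matrices are trained to store knowledge; the
    backbone language model weights stay frozen.
    """
    z = np.maximum(h @ W_down, 0.0)  # down-project, then ReLU
    return h + z @ W_up              # up-project and add residual

# Hypothetical sizes for illustration: d_model=8, bottleneck=2.
rng = np.random.default_rng(0)
h = rng.normal(size=(4, 8))
W_down = rng.normal(size=(8, 2)) * 0.01
W_up = rng.normal(size=(2, 8)) * 0.01
out = adapter(h, W_down, W_up)
```

Because of the residual connection, an adapter initialized near zero leaves the backbone's behavior almost unchanged, which is why such modules can be trained on knowledge corpora without retraining the full model.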