To diversify and enrich generated dialogue responses, knowledge-grounded dialogue has been investigated in recent years. Despite the success of existing methods, they mainly follow the paradigm of retrieving relevant sentences from a large corpus and augmenting the dialogues with explicit extra information, which is time- and resource-consuming. In this paper, we propose KnowExpert, an end-to-end framework that bypasses the retrieval process by injecting prior knowledge into pre-trained language models with lightweight adapters. To the best of our knowledge, this is the first attempt to tackle this task relying solely on a generation-based approach. Experimental results show that KnowExpert performs comparably with the retrieval-based baselines, demonstrating the potential of our proposed direction.