Open-domain conversational search assistants aim to answer user questions about open topics in a conversational manner. In this paper we show how the Transformer architecture achieves state-of-the-art results in key IR tasks, enabling conversational assistants that engage in open-domain conversational search with single, yet informative, answers. In particular, we propose an open-domain abstractive conversational search agent pipeline that addresses two major challenges: first, conversation context-aware search, and second, abstractive search-answer generation. To address the first challenge, the conversation context is modeled with a query-rewriting method that unfolds the context of the conversation up to a given turn, so that the correct answers can be retrieved. These answers are then passed to a Transformer-based re-ranker to further improve retrieval performance. The second challenge is tackled with recent abstractive Transformer architectures that generate a digest of the top-ranked, most relevant passages. Experiments show that Transformers deliver solid performance across all conversational search tasks, outperforming the best TREC CAsT 2019 baseline.
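The three-stage pipeline described above (context-aware query rewriting, retrieval with Transformer-based re-ranking, and abstractive answer generation) can be sketched as follows. This is a minimal toy skeleton for illustration only: each component is a simple heuristic stand-in (context concatenation, term-overlap scoring, first-passage extraction), not the Transformer models the paper actually uses, and all function names are hypothetical.

```python
# Toy skeleton of a conversational search pipeline: (1) rewrite the query
# using conversation context, (2) retrieve and re-rank passages,
# (3) generate a single answer from the top passages.
# All components are illustrative heuristics, NOT the paper's Transformer
# models (query rewriting, BERT-style re-ranking, abstractive generation).
from collections import Counter


def tokenize(text: str) -> list[str]:
    """Crude tokenizer: lowercase and strip basic punctuation."""
    return text.lower().replace(".", "").replace("?", "").split()


def rewrite_query(history: list[str], current_turn: str) -> str:
    """Unfold the conversation context up to the current turn into a
    self-contained query (stand-in for learned query rewriting)."""
    return " ".join(history + [current_turn])


def retrieve(query: str, passages: list[str], k: int = 3) -> list[str]:
    """First-stage retrieval by term overlap (stand-in for BM25)."""
    q_terms = set(tokenize(query))
    scored = sorted(passages,
                    key=lambda p: len(q_terms & set(tokenize(p))),
                    reverse=True)
    return scored[:k]


def rerank(query: str, candidates: list[str]) -> list[str]:
    """Re-rank candidates (stand-in for a Transformer cross-encoder
    that scores each query-passage pair)."""
    q = Counter(tokenize(query))
    return sorted(candidates,
                  key=lambda p: sum(q[t] for t in tokenize(p)),
                  reverse=True)


def generate_answer(ranked_passages: list[str]) -> str:
    """Digest the top passages into one answer (stand-in for an
    abstractive Transformer summarizer; here: take the best passage)."""
    return ranked_passages[0] if ranked_passages else ""


# Usage: answer a follow-up turn given earlier conversation context.
history = ["Tell me about the Transformer architecture."]
turn = "How is it used for re-ranking?"
corpus = [
    "The Transformer architecture relies on self-attention.",
    "BERT-based re-ranking scores query-passage pairs with a Transformer.",
    "Penguins are flightless birds.",
]
query = rewrite_query(history, turn)
answer = generate_answer(rerank(query, retrieve(query, corpus)))
```

Note how the rewritten query carries the antecedent "Transformer architecture" from the first turn, which the bare follow-up question ("How is it used...") lacks; without the rewriting step, retrieval on the follow-up alone would have no topical terms to match.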