Conversational question generation (CQG) is a vital task for machines to assist humans through conversations, for example in interactive reading comprehension. Compared to traditional single-turn question generation (SQG), CQG is more challenging in that the generated question must not only be meaningful but also align with the conversation history that has occurred. While previous studies mainly focus on modelling the flow and alignment of the conversation, there has been no thorough study to date on which parts of the context and history the model actually needs. We argue that shortening the context and history is crucial, as it helps the model optimise further on the conversational alignment property. To this end, we propose CoHS-CQG, a two-stage CQG framework that adopts a CoHS module to shorten the context and history of the input. In particular, CoHS selects contiguous sentences and history turns according to their relevance scores using a top-p strategy. Our model achieves state-of-the-art performance on CoQA in both the answer-aware and answer-unaware settings.
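The top-p selection described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the softmax normalization of relevance scores, and the threshold value are all assumptions for exposition.

```python
import math

def top_p_select(scores, p=0.8):
    """Hedged sketch of a top-p (nucleus) selection strategy: pick the
    smallest set of candidates (e.g. context sentences or history turns)
    whose softmax-normalized relevance mass reaches p. Returns the
    selected indices in their original order. Normalization choice and
    function name are illustrative assumptions, not the CoHS internals."""
    exp = [math.exp(s) for s in scores]
    total = sum(exp)
    probs = [e / total for e in exp]
    # Consider candidates from most to least relevant.
    order = sorted(range(len(scores)), key=lambda i: probs[i], reverse=True)
    chosen, mass = [], 0.0
    for i in order:
        chosen.append(i)
        mass += probs[i]
        if mass >= p:
            break
    return sorted(chosen)
```

For example, with relevance scores `[3.0, 1.0, 0.2]` and `p=0.5`, the highest-scoring item alone already covers the required probability mass and is the only one selected; raising `p` admits more items.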