We introduce an approach to answer-aware question generation. Rather than relying solely on the capacity of strong pre-trained language models, we observe that the information needed to form a question about a given answer is concentrated in a few relevant sentences of the context. Based on this observation, we design a model with two modules: a selector and a generator. The selector encourages the model to focus on the sentences relevant to the answer, providing implicit local information. The generator produces questions by implicitly combining this local information from the selector with global information from the encoder's representation of the whole context. The two modules are trained jointly to exploit latent interactions between them. Experimental results on two benchmark datasets show that our model outperforms strong pre-trained models on the question generation task. The code is also available (shorturl.at/lV567).
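To make the selector-generator design above concrete, the following is a minimal sketch in PyTorch of one plausible realization: the selector softly weights sentence representations by their relevance to the answer (local information), and the generator decodes the question conditioned on both this local vector and a mean-pooled global encoding of the context, with one joint loss. Every module name, dimension, and wiring choice is an illustrative assumption, not the paper's actual architecture.

import torch
import torch.nn as nn

class SelectorGeneratorQG(nn.Module):
    def __init__(self, vocab_size=32000, d=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d)
        # Global encoder over the whole context (assumption: a single GRU).
        self.encoder = nn.GRU(d, d, batch_first=True)
        # Selector: scores each sentence's relevance to the answer.
        self.selector = nn.Linear(2 * d, 1)
        # Generator: decodes the question, conditioned on local + global info.
        self.decoder = nn.GRU(3 * d, d, batch_first=True)
        self.out = nn.Linear(d, vocab_size)

    def forward(self, context_ids, sent_reps, answer_rep, question_ids):
        # context_ids:  (B, T) token ids of the whole context
        # sent_reps:    (B, S, d) sentence representations (assumed precomputed)
        # answer_rep:   (B, d) representation of the answer span
        # question_ids: (B, L) gold question tokens (teacher forcing)
        enc_out, _ = self.encoder(self.embed(context_ids))
        global_rep = enc_out.mean(dim=1)                      # global information

        # Selector: soft attention over sentences with respect to the answer,
        # yielding an implicit local representation.
        ans = answer_rep.unsqueeze(1).expand_as(sent_reps)
        weights = torch.softmax(
            self.selector(torch.cat([sent_reps, ans], dim=-1)), dim=1)
        local_rep = (weights * sent_reps).sum(dim=1)          # local information

        # Generator: every decoding step sees both local and global vectors.
        cond = torch.cat([local_rep, global_rep], dim=-1)     # (B, 2d)
        dec_in = torch.cat(
            [self.embed(question_ids),
             cond.unsqueeze(1).expand(-1, question_ids.size(1), -1)], dim=-1)
        dec_out, _ = self.decoder(dec_in)
        return self.out(dec_out)                              # (B, L, vocab)

# Joint training: a single loss backpropagates through both modules, so the
# selector and generator interact through shared gradients.
model = SelectorGeneratorQG()
question = torch.randint(0, 32000, (2, 12))
logits = model(torch.randint(0, 32000, (2, 50)),  # context token ids
               torch.randn(2, 6, 256),            # 6 sentence representations
               torch.randn(2, 256),               # answer representation
               question)
# (Shifting targets for next-token prediction is omitted for brevity.)
loss = nn.CrossEntropyLoss()(logits.reshape(-1, 32000), question.reshape(-1))
loss.backward()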