Recent years have seen significant advances in text generation tasks driven by neural language models. However, one challenging task has made little progress so far: generating math problem text from mathematical equations. In this paper, we present a novel equation-to-problem-text generation model. In our model, 1) we propose a flexible scheme to effectively encode math equations, and we enhance the equation encoder with a Variational Autoencoder (VAE); 2) given a math equation, we perform topic selection, after which a dynamic topic memory mechanism is introduced to constrain the topic distribution of the generator; 3) to avoid the commonsense violations common in traditional generation models, we pretrain word embeddings on a background knowledge graph (KG) and link decoded words to related words in the KG, with the aim of injecting background knowledge into our model. We evaluate our model through both automatic metrics and human evaluation; experiments demonstrate that our model outperforms baselines and previous models in both the accuracy and the richness of the generated problem text.
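To make the first component concrete, the following is a minimal sketch (not the paper's implementation; all module names, sizes, and the GRU choice are illustrative assumptions) of an equation encoder whose representation is regularized by a VAE-style latent variable:

```python
# Minimal sketch, assuming equations are tokenized into symbol ids.
# A recurrent encoder reads the equation; its final state parameterizes
# a Gaussian posterior, and the reparameterized sample serves as the
# equation representation passed to the generator.
import torch
import torch.nn as nn

class VAEEquationEncoder(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256, latent_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)   # equation-symbol embeddings
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.to_mu = nn.Linear(hidden_dim, latent_dim)     # posterior mean
        self.to_logvar = nn.Linear(hidden_dim, latent_dim) # posterior log-variance

    def forward(self, eq_tokens):
        # eq_tokens: (batch, seq_len) ids of equation symbols, e.g. "x + 2 = 5"
        _, h = self.gru(self.embed(eq_tokens))             # h: (1, batch, hidden_dim)
        h = h.squeeze(0)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        # KL term added to the generation loss, encouraging a smooth latent space
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1).mean()
        return z, kl

# Usage: encode a batch of four tokenized equations of length 12
enc = VAEEquationEncoder(vocab_size=50)
z, kl = enc(torch.randint(0, 50, (4, 12)))
print(z.shape, kl.item())  # torch.Size([4, 64])
```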