Answering complex questions that require multi-step, multi-type reasoning over raw text is challenging, especially when numerical reasoning is involved. Neural Module Networks (NMNs) follow the programmer-interpreter framework and design trainable modules to learn different reasoning skills. However, NMNs have only limited reasoning abilities and lack numerical reasoning capability. We upgrade NMNs by (a) bridging the gap between the interpreter and complex questions, and (b) introducing addition and subtraction modules that perform numerical reasoning over numbers. On a subset of DROP, experimental results show that our methods enhance NMNs' numerical reasoning skills, improving F1 score by 17.7% and significantly outperforming previous state-of-the-art models.
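To make the idea of addition and subtraction modules concrete, here is a minimal sketch in the NMN style. It assumes (as is common in module networks for DROP-like tasks) that each module consumes soft attention weights over the numbers mentioned in the passage and emits an expected value; the function names and signatures are illustrative, not the paper's actual API.

```python
# Hypothetical sketch of NMN-style numerical modules.
# A module soft-selects numbers via attention weights, then
# combines the selected values arithmetically.

def expected_number(numbers, attention):
    """Soft-select a number: attention-weighted sum over candidates."""
    assert len(numbers) == len(attention)
    return sum(n * a for n, a in zip(numbers, attention))

def add_module(numbers, att1, att2):
    """Addition module: sum of two soft-selected numbers."""
    return expected_number(numbers, att1) + expected_number(numbers, att2)

def subtract_module(numbers, att1, att2):
    """Subtraction module: difference of two soft-selected numbers."""
    return expected_number(numbers, att1) - expected_number(numbers, att2)

# Numbers extracted from a passage, e.g. "scored 24 points ... then 17 points"
numbers = [24.0, 17.0]
print(add_module(numbers, [1.0, 0.0], [0.0, 1.0]))       # 41.0
print(subtract_module(numbers, [1.0, 0.0], [0.0, 1.0]))  # 7.0
```

In training, the attention vectors would be produced by the interpreter rather than given by hand; the soft (differentiable) selection is what lets gradients flow through the arithmetic.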