We present an approach to systematic reasoning that produces human-interpretable proof trees grounded in a factbase. Our solution resembles a classic Prolog-based inference engine, but replaces handcrafted rules with a combination of neural language modeling, guided generation, and semiparametric dense retrieval. This novel reasoning engine, NELLIE, dynamically instantiates interpretable inference rules that capture and score entailment (de)compositions over natural language statements. NELLIE achieves competitive performance on scientific QA datasets that require structured explanations over multiple facts.
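To make the Prolog-style control flow concrete, the following is a minimal sketch (not the authors' implementation) of a backward-chaining loop over natural language statements. The functions `retrieve_facts`, `generate_decompositions`, and `entailment_score` are hypothetical toy stand-ins for the dense retriever, guided generator, and entailment scorer described above.

```python
from dataclasses import dataclass, field

# Toy stand-ins for the learned components; in the real system these would be a
# dense retriever over the factbase, a guided neural generator of premise pairs,
# and a trained entailment scorer.  All names here are illustrative assumptions.
FACTBASE = {"an oak is a kind of tree", "trees are plants"}

def retrieve_facts(goal):
    # Placeholder retrieval: exact string match with similarity 1.0.
    return [(f, 1.0) for f in FACTBASE if f == goal]

def generate_decompositions(goal):
    # Placeholder generator: propose one fixed two-premise decomposition.
    return [("an oak is a kind of tree", "trees are plants")] if "oak" in goal else []

def entailment_score(premises, hypothesis):
    # Placeholder entailment model score.
    return 0.95

@dataclass
class Proof:
    goal: str                                   # statement being proved
    children: list = field(default_factory=list)  # sub-proofs (empty for leaf facts)
    score: float = 1.0                          # confidence of this proof step

def prove(goal, depth=0, max_depth=3):
    # Base case: the goal is directly grounded in the factbase.
    for fact, sim in retrieve_facts(goal):
        if sim >= 0.9:
            return Proof(goal, score=sim)
    if depth >= max_depth:
        return None
    # Recursive case: dynamically instantiate an inference rule by generating a
    # candidate decomposition of the goal, proving both premises, and scoring
    # whether they jointly entail the goal.
    for p1, p2 in generate_decompositions(goal):
        left, right = prove(p1, depth + 1), prove(p2, depth + 1)
        if left and right:
            s = entailment_score([p1, p2], goal)
            return Proof(goal, [left, right], s * left.score * right.score)
    return None

print(prove("an oak is a plant"))  # prints a small proof tree with its score
```

The returned `Proof` object is the human-interpretable proof tree: each node records the statement proved, its supporting sub-proofs, and an entailment-based score.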