Understanding and creating mathematics using natural mathematical language - the mixture of symbolic and natural language used by humans - is a challenging and important problem for driving progress in machine learning. As a step in this direction, we develop NaturalProofs, a multi-domain corpus of mathematical statements and their proofs, written in natural mathematical language. NaturalProofs unifies broad coverage, deep coverage, and low-resource mathematical sources, allowing for evaluating both in-distribution and zero-shot generalization. Using NaturalProofs, we benchmark strong neural methods on mathematical reference retrieval and generation tasks which test a system's ability to determine key results that appear in a proof. Large-scale sequence models show promise compared to classical information retrieval methods, yet their performance and out-of-domain generalization leave substantial room for improvement. NaturalProofs opens many avenues for research on challenging mathematical tasks.