Learning the underlying equation from data is a fundamental problem in many disciplines. Recent advances rely on Neural Networks (NNs) but do not provide theoretical guarantees of recovering the exact equations, owing to the non-convexity of NNs. In this paper, we propose Convex Neural Symbolic Learning (CoNSoLe) to seek convexity under mild conditions. The main idea is to decompose the recovery process into two steps and convexify each step. In the first step of searching for the right symbols, we convexify deep Q-learning. The key is to maintain double convexity for both the negative Q-function and the negative reward function in each iteration, leading to provable convexity of the negative optimal Q-function for learning the true symbol connections. Conditioned on an exact search result, we construct a Locally Convex equation Learner (LoCaL) neural network to convexify the estimation of symbol coefficients. With this design, we quantify a large region of strict convexity in the loss surface of LoCaL for commonly used physical functions. Finally, we demonstrate the superior performance of the CoNSoLe framework over the state of the art on a diverse set of datasets.
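To make the two-step decomposition concrete, the sketch below illustrates the general shape of the problem: the target equation is modeled as a sparse combination of library symbols, step 1 selects which symbols are active, and step 2 estimates their coefficients. This is not the authors' implementation: CoNSoLe performs step 1 with convexified deep Q-learning and step 2 with the LoCaL network, whereas the stand-ins here (a hardcoded symbol choice and an ordinary least-squares fit) and all names such as LIBRARY and fit_coefficients are illustrative assumptions only.

```python
import numpy as np

# Hypothetical symbol library. The paper searches over such symbols with
# convexified deep Q-learning (step 1); here the selection is hardcoded
# purely for illustration.
LIBRARY = {
    "x": lambda x: x,
    "x^2": lambda x: x**2,
    "sin(x)": np.sin,
    "exp(x)": np.exp,
}

def fit_coefficients(x, y, symbols):
    """Step 2 stand-in: least-squares estimate of symbol coefficients.
    The paper instead trains the LoCaL network, whose loss surface is
    shown to have a large strictly convex region; plain least squares
    is used here only as a sketch."""
    Phi = np.column_stack([LIBRARY[s](x) for s in symbols])
    coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return coef, Phi @ coef

# Toy data generated from y = 2x + 0.5 sin(x).
rng = np.random.default_rng(0)
x = rng.uniform(-3.0, 3.0, 200)
y = 2.0 * x + 0.5 * np.sin(x)

# Step 1 stand-in: assume the symbol search returned the true symbols.
chosen = ["x", "sin(x)"]
coef, y_hat = fit_coefficients(x, y, chosen)
print(dict(zip(chosen, np.round(coef, 3))))  # ~ {'x': 2.0, 'sin(x)': 0.5}
```

With a noiseless toy target and the correct symbols selected, the coefficient fit recovers the generating equation exactly; the paper's contribution is making both the symbol search and the coefficient estimation provably (locally) convex rather than relying on such oracle choices.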