Given ample experimental data from a system governed by differential equations, deep learning techniques can be used to construct the underlying differential operators. In this work we perform symbolic discovery of differential operators from sparse experimental data. This small data regime in machine learning can be made tractable by providing our algorithms with prior information about the underlying dynamics. Physics Informed Neural Networks (PINNs) have been very successful in this regime (reconstructing entire ODE solutions using only a single point, or entire PDE solutions with very few measurements of the initial condition). We modify the PINN approach by adding a neural network that learns a representation of unknown hidden terms in the differential equation. The algorithm yields both a surrogate solution to the differential equation and a black-box representation of the hidden terms. These hidden term neural networks can then be converted into symbolic equations using symbolic regression techniques such as AI Feynman. To achieve convergence of these neural networks, we provide our algorithms with (noisy) measurements of the initial condition as well as (synthetic) experimental data obtained at later times. We demonstrate strong performance of this approach even when provided with very few measurements of noisy data, in both the ODE and PDE settings.
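To make the setup concrete, the following is a minimal sketch (not the paper's actual implementation) of the combined loss the abstract describes for an ODE u'(t) = f(u): one small network represents the surrogate solution, a second represents the unknown hidden term f, and the loss sums a physics residual on collocation points with a data misfit at a few noisy measurements. The network sizes, the finite-difference derivative (a real PINN would use automatic differentiation), and the stand-in trajectory exp(-t) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_init(sizes):
    # Parameters of a small fully connected network (illustrative sizes).
    return [(rng.normal(scale=0.5, size=(m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp_apply(params, x):
    # Forward pass with tanh hidden layers and a linear output layer.
    for W, b in params[:-1]:
        x = np.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b

# Surrogate solution u_theta(t) and hidden-term network f_phi(u).
u_net = mlp_init([1, 16, 16, 1])
f_net = mlp_init([1, 16, 1])

# Physics residual u'(t) - f(u(t)) on collocation points; the derivative
# here is a finite-difference stand-in for autodiff.
t = np.linspace(0.0, 2.0, 50)[:, None]
u = mlp_apply(u_net, t)
du = np.gradient(u[:, 0], t[:, 0])[:, None]
residual = du - mlp_apply(f_net, u)

# A few noisy measurements of the trajectory (exp(-t) is a hypothetical
# ground truth used only for this sketch).
t_data = np.array([[0.0], [0.5], [1.5]])
u_data = np.exp(-t_data) + 0.05 * rng.normal(size=t_data.shape)
misfit = mlp_apply(u_net, t_data) - u_data

# Training would minimize this combined loss over both networks' weights.
loss = np.mean(residual**2) + np.mean(misfit**2)
```

Once both networks converge, only the trained hidden-term network would be handed to a symbolic regression tool such as AI Feynman to recover a closed-form expression for f.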