Physics-Informed Neural Networks (PINNs) have become a widely used machine learning approach for solving partial differential equations (PDEs). However, on high-dimensional second-order PDE problems, PINNs suffer from severe scalability issues: the loss involves second-order derivatives, and the cost of computing them via stacked back-propagation grows with the dimension. In this work, we develop a novel approach that significantly accelerates the training of Physics-Informed Neural Networks. In particular, we parameterize the PDE solution by a Gaussian-smoothed model and show that, via Stein's identity, the second-order derivatives can be computed efficiently without back-propagation. We further discuss the model's capacity and provide variance reduction methods that address key limitations in the derivative estimation. Experimental results show that our proposed method achieves error competitive with standard PINN training while being significantly faster. Our code is released at https://github.com/LithiumDA/PINN-without-Stacked-BP.
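To illustrate the core idea (a minimal sketch, not the authors' implementation): for a Gaussian-smoothed function $f_\sigma(x) = \mathbb{E}_{\delta \sim \mathcal{N}(0,\sigma^2)}[f(x+\delta)]$, Stein's identity gives the second derivative as an expectation of zeroth-order evaluations, $f_\sigma''(x) = \mathbb{E}[(\delta^2-\sigma^2)\,f(x+\delta)]/\sigma^4$, so no back-propagation through $f$ is needed. Subtracting $f(x)$ inside the expectation is one simple variance-reduction trick (valid because the Stein weight has mean zero); the specific estimator below is illustrative.

```python
import numpy as np

def stein_second_derivative(f, x, sigma, n_samples, rng):
    """Monte Carlo estimate of the second derivative of the
    Gaussian-smoothed f at x, using only function evaluations."""
    d = rng.normal(0.0, sigma, size=n_samples)
    w = (d**2 - sigma**2) / sigma**4        # Stein weight, E[w] = 0
    # Control variate: subtracting f(x) leaves the mean unchanged
    # (since E[w] = 0) but sharply reduces the estimator's variance.
    return np.mean(w * (f(x + d) - f(x)))

rng = np.random.default_rng(0)
f, x, sigma = np.sin, 1.0, 0.5
est = stein_second_derivative(f, x, sigma, 200_000, rng)
# Closed form for comparison: smoothing sin(x) with N(0, sigma^2) gives
# exp(-sigma^2/2) * sin(x), whose second derivative is the negative of that.
exact = -np.exp(-sigma**2 / 2) * np.sin(x)
```

In a PINN, `f` would be the network itself, so every derivative in the PDE residual reduces to a batch of forward passes at perturbed inputs, which is what removes the stacked back-propagation cost.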