We present a novel class of approximations for variational losses, applicable to the training of physics-informed neural networks (PINNs). The loss formulation reflects classic Sobolev space theory for partial differential equations (PDEs) and their weak formulations. The loss computation rests on an extension of Gauss-Legendre cubatures, which we term Sobolev cubatures, replacing automatic differentiation (A.D.). We prove that the runtime complexity of training the resulting Sobolev-PINNs (SC-PINNs) is less than that required by PINNs relying on A.D. In addition to a one-to-two order of magnitude speed-up, the SC-PINNs are demonstrated to achieve closer solution approximations for prominent forward and inverse PDE problems than established PINNs.
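To make the derivative-free loss idea concrete, the following minimal sketch (our own illustration under simplifying assumptions, not the authors' implementation) shows the general cubature-based mechanism in 1D: on Gauss-Legendre nodes, derivatives of the sampled network output are applied as a precomputed polynomial differentiation matrix, so the PDE residual loss reduces to a weighted quadrature sum with no automatic differentiation in the spatial variable. The toy PDE u'' + pi^2 u = 0 and the use of an exact solution as a stand-in for the network are hypothetical choices for the example.

```python
import numpy as np

def diff_matrix(nodes):
    """Polynomial differentiation matrix on arbitrary nodes (barycentric form)."""
    n = len(nodes)
    dx = nodes[:, None] - nodes[None, :]
    np.fill_diagonal(dx, 1.0)                 # avoid division by zero on the diagonal
    w = 1.0 / dx.prod(axis=1)                 # barycentric interpolation weights
    D = (w[None, :] / w[:, None]) / dx        # off-diagonal entries D_ij = (w_j/w_i)/(x_i-x_j)
    np.fill_diagonal(D, 0.0)
    np.fill_diagonal(D, -D.sum(axis=1))       # rows of D must annihilate constants
    return D

# Gauss-Legendre nodes and quadrature weights on [-1, 1].
nodes, qw = np.polynomial.legendre.leggauss(16)
D = diff_matrix(nodes)

# PINN-style residual loss for u'' + pi^2 u = 0 without automatic differentiation:
# derivatives become matrix products, and the squared L2 norm of the residual
# is evaluated as a Gauss-Legendre quadrature sum.
u = np.sin(np.pi * nodes)                     # stand-in for network output at the nodes
residual = D @ (D @ u) + np.pi**2 * u
loss = qw @ residual**2
print(loss)                                   # ~0, up to spectral accuracy
```

Because `D` and the quadrature weights depend only on the node set, they can be assembled once before training; each loss evaluation is then a pair of matrix-vector products rather than a backward pass through the network's input derivatives.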