This paper explores the difficulties of solving partial differential equations (PDEs) with physics-informed neural networks (PINNs). PINNs incorporate the physics as a regularization term in the objective function. However, a drawback of this approach is the need for manual hyperparameter tuning, which makes it impractical when validation data or prior knowledge of the solution is unavailable. Our investigations of the loss landscapes and backpropagated gradients in the presence of physics reveal that existing methods produce non-convex loss landscapes that are hard to navigate. Our findings demonstrate that high-order PDEs contaminate the backpropagated gradients and hinder convergence. To address these challenges, we introduce a novel method that bypasses the calculation of high-order derivative operators and mitigates the contamination of backpropagated gradients. Consequently, we reduce the dimension of the search space and make learning PDEs with non-smooth solutions feasible. Our method also provides a mechanism to focus on complex regions of the domain. In addition, we present a dual unconstrained formulation based on the Lagrange multiplier method to enforce equality constraints on the model's prediction, with adaptive and independent learning rates inspired by adaptive subgradient methods. We apply our approach to solve various linear and non-linear PDEs.
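To make the "physics as a regularization term" idea concrete, the following is a minimal sketch of a PINN-style objective for the 1-D Poisson problem u''(x) = f(x) on [0, 1] with u(0) = u(1) = 0. This is an illustrative toy setup, not the paper's proposed method: the network is an arbitrary one-hidden-layer MLP, the second derivative is approximated by central finite differences so the example stays dependency-free (real PINNs obtain it via automatic differentiation), and the weight `lam` is the hyperparameter whose manual tuning the abstract identifies as a drawback.

```python
import numpy as np

# Tiny untrained MLP standing in for u_theta(x); x has shape (1, n).
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(16, 1)), np.zeros((16, 1))
W2, b2 = rng.normal(size=(1, 16)), np.zeros((1, 1))

def u(x):
    return W2 @ np.tanh(W1 @ x + b1) + b2

def f(x):
    # Manufactured source term (exact solution would be sin(pi x)).
    return -np.pi**2 * np.sin(np.pi * x)

x = np.linspace(0.0, 1.0, 101).reshape(1, -1)
h = x[0, 1] - x[0, 0]

# Physics residual r = u'' - f at interior collocation points,
# with u'' approximated by central finite differences.
ux = u(x)
uxx = (ux[:, 2:] - 2 * ux[:, 1:-1] + ux[:, :-2]) / h**2
residual = uxx - f(x[:, 1:-1])

# Boundary (data) term enforcing u(0) = u(1) = 0.
bc = u(np.array([[0.0, 1.0]]))

# PINN objective: data misfit plus the physics residual acting as a
# regularizer, weighted by the manually tuned hyperparameter lam.
lam = 1.0
loss = np.mean(bc**2) + lam * np.mean(residual**2)
print(f"PINN loss at initialization: {loss:.4f}")
```

Note that for a high-order PDE the residual would involve high-order derivatives of the network output, which is the source of the gradient contamination the abstract describes.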