Physics-informed neural network (PINN) algorithms have shown promising results in solving a wide range of problems involving partial differential equations (PDEs). However, they often fail to converge to desirable solutions when the target function contains high-frequency features, due to a phenomenon known as spectral bias. In the present work, we exploit neural tangent kernels (NTKs) to investigate the training dynamics of PINNs evolving under stochastic gradient descent with momentum (SGDM), and demonstrate that SGDM significantly reduces the effect of spectral bias. We also examine why training a model with the Adam optimizer can accelerate convergence while reducing spectral bias. Moreover, our numerical experiments confirm that sufficiently wide networks trained with SGDM still converge to desirable solutions, even in the presence of high-frequency features. Indeed, we show that the width of a network plays a critical role in convergence.
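To make the setup concrete, the sketch below (not the authors' implementation) shows one way to compute the empirical NTK of a PINN residual in JAX and to monitor its eigenvalues while training with SGDM via optax. The 1D Poisson problem with a high-frequency source, the tanh MLP architecture, the collocation grid, and all hyperparameters are illustrative assumptions; boundary-condition terms are omitted for brevity.

```python
# Minimal sketch: empirical NTK of a PINN residual, monitored during SGDM training.
# Problem, architecture, and hyperparameters are illustrative assumptions only.
import jax
import jax.numpy as jnp
import optax
from jax.flatten_util import ravel_pytree


def init_params(key, width=256):
    # Two-hidden-layer tanh MLP; width is the quantity whose role is discussed above.
    k1, k2, k3 = jax.random.split(key, 3)
    return [
        (jax.random.normal(k1, (1, width)), jnp.zeros(width)),
        (jax.random.normal(k2, (width, width)) / jnp.sqrt(width), jnp.zeros(width)),
        (jax.random.normal(k3, (width, 1)) / jnp.sqrt(width), jnp.zeros(1)),
    ]


def mlp(params, x):
    # Scalar input x -> scalar network output u(x).
    h = jnp.atleast_1d(x)
    for w, b in params[:-1]:
        h = jnp.tanh(h @ w + b)
    w, b = params[-1]
    return (h @ w + b)[0]


def residual(params, x):
    # PDE residual r(x) = u''(x) - f(x) with an assumed high-frequency source term.
    u_xx = jax.grad(jax.grad(lambda t: mlp(params, t)))(x)
    f = -(25.0 * jnp.pi**2) * jnp.sin(5.0 * jnp.pi * x)
    return u_xx - f


def pde_loss(params, xs):
    # Mean-squared residual over collocation points (boundary terms omitted).
    return jnp.mean(jax.vmap(lambda x: residual(params, x))(xs) ** 2)


def ntk_matrix(params, xs):
    # Empirical NTK of the residual map: K_ij = <dr(x_i)/dtheta, dr(x_j)/dtheta>.
    jac = jax.vmap(lambda x: ravel_pytree(jax.grad(residual)(params, x))[0])(xs)
    return jac @ jac.T


params = init_params(jax.random.PRNGKey(0))
xs = jnp.linspace(0.0, 1.0, 64)
opt = optax.sgd(learning_rate=1e-5, momentum=0.9)  # SGDM; rate is an assumed value
opt_state = opt.init(params)


@jax.jit
def step(params, opt_state):
    grads = jax.grad(pde_loss)(params, xs)
    updates, opt_state = opt.update(grads, opt_state)
    return optax.apply_updates(params, updates), opt_state


for it in range(2001):
    params, opt_state = step(params, opt_state)
    if it % 500 == 0:
        # NTK eigenvalues indicate how quickly different residual components are
        # learned; tracking them during training is one way to observe spectral bias.
        eigs = jnp.linalg.eigvalsh(ntk_matrix(params, xs))
        print(it, float(pde_loss(params, xs)), float(eigs[-1]))
```

Here the NTK is assembled from per-sample Jacobians of the residual with respect to the parameters, so its spectrum reflects how fast different components of the residual are fitted; repeating the run with plain SGD or Adam, or with a different width, is the kind of comparison the abstract describes.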