An important inference from Neural Tangent Kernel (NTK) theory is the existence of spectral bias (SB): the low-frequency components of the target function of a fully connected Artificial Neural Network (ANN) are learnt significantly faster than the higher frequencies during training. This has been established for Mean Squared Error (MSE) loss functions with very small learning rates. Physics-Informed Neural Networks (PINNs) are designed to learn the solutions of differential equations (DEs) of arbitrary order; in PINNs the loss functions are obtained from the residuals of the conservative form of the DEs and represent the degree to which the equations are unsatisfied. It has therefore been an open question whether (a) PINNs also exhibit SB and (b) if so, how this bias varies with the order of the DE. In this work, a series of numerical experiments is conducted on simple sinusoidal functions of varying frequencies, compositions and equation orders to investigate these questions. It is firmly established that, under normalized conditions, PINNs do exhibit strong spectral bias, and that it increases with the order of the differential equation.
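To make the residual-based loss concrete, the following is a minimal hypothetical sketch (not the paper's implementation) of a PINN loss for the first-order ODE u'(x) = ω·cos(ωx), whose exact solution is u(x) = sin(ωx). The network architecture, initialization, and collocation points are illustrative assumptions; JAX is used here only for its automatic differentiation.

```python
import jax
import jax.numpy as jnp

def mlp(params, x):
    # Tiny fully connected network: scalar input, scalar output.
    for W, b in params[:-1]:
        x = jnp.tanh(W @ x + b)
    W, b = params[-1]
    return (W @ x + b)[0]

def init_params(key, sizes):
    # Illustrative random initialization of layer weights and biases.
    params = []
    for d_in, d_out in zip(sizes[:-1], sizes[1:]):
        key, k1, k2 = jax.random.split(key, 3)
        params.append((jax.random.normal(k1, (d_out, d_in)) * 0.5,
                       jax.random.normal(k2, (d_out,)) * 0.1))
    return params

def residual_loss(params, xs, omega):
    # The loss is the mean squared residual of the ODE u'(x) = omega*cos(omega*x),
    # i.e. the degree to which the candidate solution fails to satisfy it.
    u = lambda x: mlp(params, jnp.array([x]))
    du = jax.vmap(jax.grad(u))(xs)           # u'(x) via autodiff at collocation points
    res = du - omega * jnp.cos(omega * xs)   # pointwise equation residual
    return jnp.mean(res ** 2)

params = init_params(jax.random.PRNGKey(0), [1, 16, 16, 1])
xs = jnp.linspace(0.0, 2.0 * jnp.pi, 64)     # assumed collocation grid
loss = residual_loss(params, xs, 2.0)
```

Higher-order DEs would nest further `jax.grad` calls, which is one reason the spectral behaviour can differ across equation orders.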