Physics-informed neural networks (PINNs) have emerged as new data-driven PDE solvers for both forward and inverse problems. While promising, the expensive computational cost of obtaining solutions often restricts their broader applicability. We demonstrate that the computations in automatic differentiation (AD) can be significantly reduced by leveraging forward-mode AD when training PINNs. However, a naive application of forward-mode AD to conventional PINNs results in higher computation, losing its practical benefit. Therefore, we propose a network architecture, called separable PINN (SPINN), which facilitates forward-mode AD for more efficient computation. SPINN operates on a per-axis basis instead of the point-wise processing in conventional PINNs, decreasing the number of network forward passes. Moreover, while the computation and memory costs of standard PINNs grow exponentially with the grid resolution, those of our model are remarkably less susceptible, mitigating the curse of dimensionality. We demonstrate the effectiveness of our model on various PDE systems, significantly reducing the training run-time while achieving comparable accuracy. Project page: https://jwcho5576.github.io/spinn/
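To illustrate the forward-mode AD the abstract refers to, here is a minimal, self-contained dual-number sketch (a toy illustration only; the actual SPINN implementation and its network architecture are not shown here, and the function `f` below is a made-up stand-in for a network output). The key point is that a single forward pass propagates a tangent alongside the value, so the derivative comes at roughly the cost of one evaluation when the input dimension is small:

```python
# Toy forward-mode AD via dual numbers (illustrative sketch, not the
# paper's implementation). Each Dual carries a value and a tangent.
class Dual:
    def __init__(self, val, tan=0.0):
        self.val, self.tan = val, tan

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.tan + other.tan)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.tan * other.val + self.val * other.tan)
    __rmul__ = __mul__

def f(x):
    # Hypothetical scalar "network output": f(x) = 3x^2 + 2x
    return 3 * x * x + 2 * x

# Seeding the input tangent with 1.0 yields f(x) and f'(x) in one pass.
out = f(Dual(2.0, 1.0))
print(out.val, out.tan)  # f(2) = 16.0, f'(2) = 14.0
```

Reverse-mode AD, by contrast, requires storing intermediate activations and a separate backward sweep per output; forward mode's one-pass property is what SPINN's per-axis decomposition is designed to exploit.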