Physics-informed neural networks (PINNs) have emerged as data-driven PDE solvers for both forward and inverse problems. While promising, the expensive computational cost of obtaining solutions often restricts their broader applicability. We demonstrate that the computation in automatic differentiation (AD) can be significantly reduced by leveraging forward-mode AD when training PINNs. However, a naive application of forward-mode AD to conventional PINNs increases the computation instead, negating its practical benefit. We therefore propose a network architecture, called separable PINN (SPINN), that facilitates forward-mode AD for more efficient computation. SPINN operates on a per-axis basis instead of the point-wise processing of conventional PINNs, decreasing the number of network forward passes. Moreover, while the computation and memory costs of standard PINNs grow exponentially with the grid resolution, those of our model are far less affected, mitigating the curse of dimensionality. We demonstrate the effectiveness of our model on various PDE systems, significantly reducing training run-time while achieving comparable accuracy. Project page: \url{https://jwcho5576.github.io/spinn/}
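To illustrate the idea behind forward-mode AD (this is a minimal pedagogical sketch with dual numbers, not the SPINN implementation, which the abstract does not detail), note that forward mode propagates a tangent alongside the primal value in a single pass. The derivative with respect to one seeded input direction therefore costs roughly one extra forward evaluation, which is why an architecture that feeds each coordinate axis separately can exploit it efficiently:

```python
# Illustrative forward-mode AD via dual numbers (hypothetical example,
# not taken from the paper). Each value carries a tangent that is
# updated by the chain rule during the same forward pass.
import math
from dataclasses import dataclass

@dataclass
class Dual:
    val: float  # primal value
    tan: float  # tangent: derivative w.r.t. the seeded input

    def __add__(self, other):
        return Dual(self.val + other.val, self.tan + other.tan)

    def __mul__(self, other):
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.tan * other.val + self.val * other.tan)

def sin(d: Dual) -> Dual:
    # chain rule: (sin u)' = cos(u) * u'
    return Dual(math.sin(d.val), math.cos(d.val) * d.tan)

# u(x) = sin(x) * x; seed dx/dx = 1 to obtain du/dx in one pass
x = Dual(2.0, 1.0)
u = sin(x) * x
# u.tan now holds du/dx = sin(x) + x*cos(x) evaluated at x = 2
```

In reverse-mode AD (the default in most PINN codebases), the same derivative would require storing intermediate activations and running a separate backward pass; forward mode avoids that bookkeeping when the number of input directions is small, as in per-axis processing.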