In this paper, we propose the augmented physics-informed neural network (APINN), which adopts soft and trainable domain decomposition and flexible parameter sharing to further improve the extended PINN (XPINN) as well as the vanilla PINN methods. In particular, a trainable gate network is employed to mimic the hard decomposition of XPINN, and it can be flexibly fine-tuned to discover a potentially better partition. The output of APINN is a weighted average of several sub-nets, with the weights given by the gate network. APINN does not require complex interface conditions, and its sub-nets can take advantage of all training samples rather than only the samples within their own subdomains. Lastly, each sub-net shares a subset of common parameters to capture the similar components across the decomposed functions. Furthermore, following the PINN generalization theory in Hu et al. [2021], we show that APINN can improve generalization through proper gate network initialization and general domain and function decomposition. Extensive experiments on different types of PDEs demonstrate how APINN improves the PINN and XPINN methods. Specifically, we present examples where XPINN performs similarly to or worse than PINN, in which cases APINN significantly improves both. We also show cases where XPINN already outperforms PINN, in which cases APINN still slightly improves on XPINN. Furthermore, we visualize the optimized gate networks and their optimization trajectories, and relate them to performance, which helps in discovering a potentially optimal decomposition. Interestingly, when initialized with different decompositions, the performance of the corresponding APINNs can differ drastically. This, in turn, shows the potential of designing an optimal domain decomposition for the differential equation problem under consideration.
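To make the soft decomposition concrete, the following is a minimal PyTorch-style sketch of the architecture described above; the class names, layer sizes, and the specific shared-trunk-plus-heads parameter-sharing scheme are illustrative assumptions rather than the paper's exact implementation. The gate network G produces softmax weights that form a partition of unity over the domain, and the APINN prediction is the weighted average u(x) = sum_k G_k(x) * u_k(x) of the K sub-net outputs.

```python
import torch
import torch.nn as nn

class MLP(nn.Module):
    """Simple fully-connected network with tanh activations."""
    def __init__(self, sizes):
        super().__init__()
        layers = []
        for i in range(len(sizes) - 1):
            layers.append(nn.Linear(sizes[i], sizes[i + 1]))
            if i < len(sizes) - 2:
                layers.append(nn.Tanh())
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

class APINN(nn.Module):
    """Soft domain decomposition: a trainable gate network weight-averages
    K sub-nets, each sharing a common trunk (hypothetical sharing scheme)."""
    def __init__(self, dim_in=2, dim_out=1, K=2, width=20):
        super().__init__()
        # Gate network: softmax over its K outputs forms a partition of unity,
        # mimicking XPINN's hard decomposition but trainable and differentiable.
        self.gate = MLP([dim_in, width, K])
        # Shared trunk captures components common to the decomposed functions.
        self.shared = MLP([dim_in, width, width])
        # Sub-net heads, one per soft subdomain.
        self.heads = nn.ModuleList(
            MLP([width, width, dim_out]) for _ in range(K)
        )

    def forward(self, x):
        w = torch.softmax(self.gate(x), dim=-1)   # (N, K) gating weights
        h = torch.tanh(self.shared(x))            # (N, width) shared features
        u = torch.stack([head(h) for head in self.heads], dim=-1)  # (N, dim_out, K)
        # Weighted average of sub-net outputs; every sub-net sees every sample,
        # so no interface conditions between subdomains are required.
        return (u * w.unsqueeze(1)).sum(dim=-1)
```

Because the PDE residual loss is applied to the single weighted-average output, all collocation points contribute gradients to every sub-net and to the gate, which is how the initial partition can be fine-tuned during training.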