Physics-informed neural networks (PINNs) have been widely used to solve various scientific computing problems. However, their high training cost limits the use of PINNs in real-time applications. Although several methods have been proposed to improve the training efficiency of PINNs, few consider the influence of initialization. To this end, we propose a New Reptile initialization based Physics-Informed Neural Network (NRPINN). The original Reptile algorithm is a meta-learning initialization method based on labeled data. PINNs can be trained with little or even no labeled data by adding partial differential equations (PDEs) as a penalty term to the loss function. Inspired by this idea, we propose the new Reptile initialization, which samples more tasks from the parameterized PDEs and adapts the penalty term of the loss. The new Reptile initialization can acquire initialization parameters from related tasks via supervised, unsupervised, or semi-supervised learning. PINNs equipped with these initialization parameters can then solve PDEs efficiently. Moreover, the new Reptile initialization can also be applied to variants of PINNs. Finally, we demonstrate and verify the NRPINN on both forward problems, including the Poisson, Burgers, and Schr\"odinger equations, and inverse problems, where unknown parameters in the PDEs are estimated. Experimental results show that NRPINN trains much faster and achieves higher accuracy than PINNs with other initialization methods.
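To make the Reptile-style meta-initialization concrete, the sketch below shows the generic Reptile outer loop: for each sampled task, the current parameters are adapted for a few inner gradient steps, and the meta-parameters are then nudged toward the adapted ones. This is a minimal illustration, not the paper's implementation; the scalar quadratic task loss (fitting a per-task target `a`) merely stands in for the per-task PDE residual losses that NRPINN would sample from the parameterized PDEs.

```python
import numpy as np

def task_loss_grad(theta, a):
    # Gradient of the stand-in task loss 0.5 * ||theta - a||^2.
    # In NRPINN this would be the gradient of a (possibly unsupervised)
    # PDE-residual loss for one sampled task.
    return theta - a

def reptile_init(tasks, theta0, inner_steps=10, inner_lr=0.1,
                 outer_lr=0.1, meta_iters=200, seed=0):
    """Return meta-learned initial parameters via the Reptile update."""
    theta = theta0.astype(float).copy()
    rng = np.random.default_rng(seed)
    for _ in range(meta_iters):
        a = tasks[rng.integers(len(tasks))]   # sample one task
        phi = theta.copy()
        for _ in range(inner_steps):          # inner adaptation (plain SGD)
            phi -= inner_lr * task_loss_grad(phi, a)
        theta += outer_lr * (phi - theta)     # Reptile meta-update
    return theta
```

With tasks whose optima are 1.0 and 3.0, the returned initialization settles near their mean, i.e. close to 2.0; a network started there adapts quickly to either task, which is the effect the new Reptile initialization exploits for families of parameterized PDEs.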