The Physics-Informed Neural Network (PINN) has proven to be a powerful tool for obtaining numerical solutions of nonlinear partial differential equations (PDEs) by leveraging the expressivity of deep neural networks and the computing power of modern heterogeneous hardware. However, its training remains time-consuming, especially in multi-query and real-time simulation settings, and its parameterization is often excessive. In this paper, we propose the Generative Pre-Trained PINN (GPT-PINN) to mitigate both challenges in the setting of parametric PDEs. GPT-PINN represents a brand-new meta-learning paradigm for parametric systems. As a network of networks, its outer/meta-network is hyper-reduced, with only one hidden layer containing a significantly reduced number of neurons. Moreover, the activation function at each hidden neuron is a (full) PINN pre-trained at a judiciously selected system configuration. The meta-network adaptively ``learns'' the parametric dependence of the system and ``grows'' this hidden layer one neuron at a time. In the end, by encompassing a very small number of networks trained at this adaptively selected set of parameter values, the meta-network is capable of generating surrogate solutions for the parametric system across the entire parameter domain accurately and efficiently.
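The meta-network structure described above can be sketched in a few lines: the hidden layer is a growing collection of frozen pre-trained PINNs, and only the output-layer coefficients are trained for a new parameter value. The sketch below is a minimal illustration under simplifying assumptions, not the paper's implementation: `FrozenPINN` is a hypothetical stand-in (a fixed closed-form function) for a fully trained PINN, and the meta-training is replaced by a least-squares fit of the output weights.

```python
import numpy as np

class FrozenPINN:
    """Hypothetical stand-in for a full PINN pre-trained at one parameter
    value mu; here just a fixed smooth function instead of a real network."""
    def __init__(self, mu):
        self.mu = mu

    def __call__(self, x):
        # placeholder for the pre-trained network's solution snapshot
        return np.sin(self.mu * x)

class GPTPINNSketch:
    """Hyper-reduced meta-network with a single hidden layer whose
    'activation functions' are frozen pre-trained PINNs; only the
    output weights c_i are trainable."""
    def __init__(self):
        self.neurons = []                # grows one pre-trained PINN at a time
        self.weights = np.zeros(0)       # one output weight per hidden neuron

    def add_neuron(self, pinn):
        self.neurons.append(pinn)
        self.weights = np.append(self.weights, 0.0)

    def __call__(self, x):
        # surrogate solution: u(x; mu) ~ sum_i c_i * Psi_i(x)
        basis = np.stack([psi(x) for psi in self.neurons])
        return self.weights @ basis

    def fit(self, x, target):
        # least-squares stand-in for the gradient-based meta-training
        A = np.stack([psi(x) for psi in self.neurons], axis=1)
        self.weights, *_ = np.linalg.lstsq(A, target, rcond=None)

# Usage: grow the hidden layer at three (illustratively chosen) parameter
# values, then fit the output weights to a new target solution.
net = GPTPINNSketch()
for mu in (1.0, 2.0, 3.0):
    net.add_neuron(FrozenPINN(mu))

x = np.linspace(0.0, np.pi, 50)
target = 0.5 * np.sin(x) - 0.2 * np.sin(3.0 * x)  # lies in the span of the basis
net.fit(x, target)
err = np.max(np.abs(net(x) - target))
```

In the actual method the selection of the parameter values is adaptive (greedy, error-indicator driven) and the coefficients are trained with the PDE residual loss; the fixed parameter grid and least-squares fit here are only for illustration.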