The Physics-Informed Neural Network (PINN) has proven to be a powerful tool for obtaining numerical solutions of nonlinear partial differential equations (PDEs) by leveraging the expressivity of deep neural networks and the computing power of modern heterogeneous hardware. However, its training remains time-consuming, especially in multi-query and real-time simulation settings, and its parameterization is often excessive. In this paper, we propose the Generative Pre-Trained PINN (GPT-PINN) to mitigate both challenges in the setting of parametric PDEs. GPT-PINN represents a brand-new meta-learning paradigm for parametric systems. As a network of networks, its outer/meta-network is hyper-reduced, with only one hidden layer containing a significantly reduced number of neurons. Moreover, the activation function at each hidden neuron is a (full) PINN pre-trained at a judiciously selected system configuration. The meta-network adaptively ``learns'' the parametric dependence of the system and ``grows'' this hidden layer one neuron at a time. In the end, by encompassing a very small number of networks trained at this adaptively selected set of parameter values, the meta-network is capable of generating surrogate solutions for the parametric system across the entire parameter domain accurately and efficiently.
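The architecture described above can be illustrated with a minimal structural sketch. This is not the paper's implementation: the placeholder `make_pretrained_pinn` stands in for a full PINN trained at one parameter value, and the output-layer weights, which in GPT-PINN would be trained against the PDE residual at a new parameter, are set by hand here purely for illustration.

```python
import numpy as np

# Placeholder for a full PINN pre-trained at parameter value mu;
# a real implementation would return a trained deep network u(x; mu).
def make_pretrained_pinn(mu):
    return lambda x: np.sin(mu * x)  # illustrative stand-in

class GPTPINN:
    """One-hidden-layer meta-network: each hidden 'neuron' is a
    pre-trained full PINN, combined by output-layer weights c_i."""
    def __init__(self):
        self.neurons = []                 # pre-trained PINNs
        self.weights = np.array([])       # output-layer coefficients

    def grow(self, pinn):
        # Greedy step: append one neuron, i.e. a full PINN trained at
        # the parameter where the current surrogate error is largest.
        self.neurons.append(pinn)
        self.weights = np.append(self.weights, 0.0)

    def __call__(self, x):
        # Surrogate solution: linear combination of the pre-trained PINNs.
        return sum(c * n(x) for c, n in zip(self.weights, self.neurons))

meta = GPTPINN()
for mu in (1.0, 2.0):                     # adaptively selected parameters (illustrative)
    meta.grow(make_pretrained_pinn(mu))
meta.weights = np.array([0.5, 0.5])       # would be trained via the PDE residual loss
u = meta(np.linspace(0.0, 1.0, 5))        # surrogate evaluated at 5 points
```

For a new parameter value, only the handful of output weights is optimized, which is why the meta-network can deliver surrogate solutions far faster than retraining a full PINN.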