Physics-informed neural networks (PINNs) as a means of discretizing partial differential equations (PDEs) are garnering much attention in the Computational Science and Engineering (CS&E) world. At least two challenges exist for PINNs at present: an understanding of accuracy and convergence characteristics with respect to tunable parameters, and the identification of optimization strategies that make PINNs as efficient as other computational science tools. The cost of PINN training remains a major challenge of physics-informed machine learning (PiML), and, in fact, of machine learning (ML) in general. This paper aims to address the latter through the study of PINNs on new tasks, for which parameterized PDEs provide a good testbed application, as tasks can be easily defined in this context. Following the ML literature, we introduce metalearning of PINNs with application to parameterized PDEs. By introducing metalearning and transfer learning concepts, we can greatly accelerate the PINN optimization process. We present a survey of model-agnostic metalearning, and then discuss our model-aware metalearning approach applied to PINNs, as well as implementation considerations and algorithmic complexity. We then test our approach on various canonical forward parameterized PDEs that have been presented in the emerging PINNs literature.