Physics-informed neural networks (PINNs) have proven to be a suitable mathematical scaffold for solving inverse problems governed by ordinary (ODEs) and partial differential equations (PDEs). Typical inverse PINNs are formulated as soft-constrained multi-objective optimization problems with several hyperparameters. In this work, we demonstrate that inverse PINNs can be framed in terms of maximum-likelihood estimators (MLE), which allows explicit error propagation from the interpolation to the physical model space through Taylor expansion, without the need for hyperparameter tuning. We explore the application of this framework to high-dimensional coupled ODEs constrained by differential algebraic equations, which are common in transient chemical and biological kinetics. Furthermore, we show that singular-value decomposition (SVD) of the ODE coupling matrices (reaction stoichiometry matrices) provides reduced, uncorrelated subspaces in which PINN solutions can be represented and onto which residuals can be projected. Finally, the SVD bases serve as preconditioners for the inversion of covariance matrices in this hyperparameter-free, robust application of MLE to ``kinetics-informed neural networks''.
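To make the SVD-based residual projection concrete, the following is a minimal sketch, not the paper's implementation: it assumes a toy mass-action scheme A$\to$B$\to$C with an illustrative stoichiometry matrix and rate constants, computes the SVD of the stoichiometry matrix, and projects the full-space ODE residual onto the reduced left-singular subspace.

\begin{verbatim}
# Minimal illustrative sketch (not the authors' code): project kinetic ODE
# residuals onto the reduced subspace spanned by the left-singular vectors
# of the stoichiometry matrix. The scheme A->B->C, the matrix M, and the
# rate constants k are assumptions for illustration only.
import numpy as np

# Stoichiometry matrix M (rows: species A, B, C; columns: reactions A->B, B->C).
M = np.array([[-1.0,  0.0],
              [ 1.0, -1.0],
              [ 0.0,  1.0]])

# SVD of the coupling (stoichiometry) matrix; U spans the reactive subspace.
U, s, Vt = np.linalg.svd(M, full_matrices=False)
rank = int(np.sum(s > 1e-10))   # numerical rank = dimension of reactive subspace
U_r = U[:, :rank]               # reduced orthonormal basis

def rate(c, k=(1.0, 0.5)):
    """Illustrative mass-action rates for A->B and B->C."""
    return np.array([k[0] * c[0], k[1] * c[1]])

def residual(c_dot, c):
    """Full-space ODE residual r = dc/dt - M @ rate(c)."""
    return c_dot - M @ rate(c)

def projected_residual(c_dot, c):
    """Residual projected onto the reduced, uncorrelated SVD subspace."""
    return U_r.T @ residual(c_dot, c)

# Example: evaluate both residuals at one state and derivative estimate
# (in a PINN, c and c_dot would come from the network and its time derivative).
c = np.array([0.8, 0.15, 0.05])
c_dot = np.array([-0.8, 0.725, 0.075])
print(residual(c_dot, c), projected_residual(c_dot, c))
\end{verbatim}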