Physics-informed neural networks (PINNs) have proven to be a suitable mathematical scaffold for solving inverse ordinary differential equation (ODE) and partial differential equation (PDE) problems. Typical inverse PINNs are formulated as soft-constrained multi-objective optimization problems with several hyperparameters. In this work, we demonstrate that inverse PINNs can be framed in terms of maximum-likelihood estimators (MLE), allowing explicit error propagation from interpolation to the physical model space through Taylor expansion, without the need for hyperparameter tuning. We explore the application of this formulation to high-dimensional coupled ODEs constrained by differential algebraic equations, which are common in transient chemical and biological kinetics. Furthermore, we show that singular-value decomposition (SVD) of the ODE coupling matrices (reaction stoichiometry matrices) provides reduced, uncorrelated subspaces in which PINN solutions can be represented and onto which residuals can be projected. Finally, SVD bases serve as preconditioners for the inversion of covariance matrices in this hyperparameter-free, robust application of MLE to ``kinetics-informed neural networks''.
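The reduced-subspace idea can be illustrated with a minimal sketch, assuming a hypothetical two-step mechanism A \to B \to C; the stoichiometry matrix, tolerance, and residual vector below are illustrative choices, not taken from the paper:

```python
import numpy as np

# Hypothetical stoichiometry matrix M for A -> B, B -> C
# (rows: species A, B, C; columns: reactions), so dc/dt = M @ r(c).
M = np.array([[-1.0,  0.0],
              [ 1.0, -1.0],
              [ 0.0,  1.0]])

# SVD of the coupling matrix: M = U @ diag(s) @ Vt.
U, s, Vt = np.linalg.svd(M, full_matrices=True)

# Numerical rank = number of independent reactive directions (here 2).
rank = int(np.sum(s > 1e-10))

# The first `rank` left singular vectors span the reactive subspace;
# the remaining column of U spans the conserved (left null) subspace,
# here total mass A + B + C.
U_r = U[:, :rank]

# Project an example ODE residual onto the reduced, uncorrelated subspace;
# orthonormality of U_r decouples the projected coordinates.
residual = np.array([0.3, -0.1, 0.2])
reduced_residual = U_r.T @ residual
```

Because the conserved direction is orthogonal to the reactive subspace, any residual component along it is removed by the projection, which is what makes the reduced coordinates suitable for covariance-matrix preconditioning.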