Recent work in scientific machine learning (SciML) has focused on incorporating partial differential equation (PDE) information into the learning process. Much of this work has focused on relatively ``easy'' PDE operators (e.g., elliptic and parabolic), with less emphasis on relatively ``hard'' PDE operators (e.g., hyperbolic). Within numerical PDEs, the latter problem class requires control of a type of volume element or conservation constraint, which is known to be challenging. Delivering on the promise of SciML requires seamlessly incorporating both types of problems into the learning process. To address this issue, we propose ProbConserv, a framework for incorporating conservation constraints into a generic SciML architecture. To do so, ProbConserv combines the integral form of a conservation law with a Bayesian update. We provide a detailed analysis of ProbConserv on learning with the Generalized Porous Medium Equation (GPME), a widely-applicable parameterized family of PDEs that illustrates the qualitative properties of both easier and harder PDEs. ProbConserv is effective for easy GPME variants, performing on par with state-of-the-art competitors; and for harder GPME variants, it outperforms other approaches that do not guarantee volume conservation. ProbConserv seamlessly enforces physical conservation constraints, maintains probabilistic uncertainty quantification (UQ), and deals well with shocks and heteroscedasticities. In each case, it achieves superior predictive performance on downstream tasks.
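To make the construction concrete, here is one standard way such a Bayesian update can be instantiated (a sketch using illustrative notation of our own, not notation taken from the abstract). Suppose a SciML model returns a Gaussian belief $u \sim \mathcal{N}(\mu, \Sigma)$ over the discretized solution, and the integral form of the conservation law is expressed as a linear constraint $G u = b$, where $G$ denotes a quadrature (integration) operator over the spatial domain and $b$ the known conserved quantity (e.g., total mass). Conditioning the Gaussian on this noiseless linear constraint yields
\begin{align*}
  \tilde{\mu}    &= \mu + \Sigma G^{\top} \left( G \Sigma G^{\top} \right)^{-1} (b - G \mu), \\
  \tilde{\Sigma} &= \Sigma - \Sigma G^{\top} \left( G \Sigma G^{\top} \right)^{-1} G \Sigma,
\end{align*}
so that every sample from $\mathcal{N}(\tilde{\mu}, \tilde{\Sigma})$ satisfies the conservation constraint exactly, while the remaining (constrained) uncertainty is retained for downstream UQ.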