We formulate a class of physics-driven deep latent variable models (PDDLVM) to learn parameter-to-solution (forward) and solution-to-parameter (inverse) maps of parametric partial differential equations (PDEs). Our formulation leverages the finite element method (FEM), deep neural networks, and probabilistic modeling to assemble a deep probabilistic framework in which the forward and inverse maps are approximated with coherent uncertainty quantification. Our probabilistic model explicitly incorporates a parametric PDE-based density and a trainable solution-to-parameter network, while the introduced amortized variational family postulates a parameter-to-solution network; all of these components are jointly trained. Furthermore, the proposed methodology does not require any expensive PDE solves and is physics-informed only at training time, enabling real-time emulation of PDEs and generation of inverse-problem solutions after training, bypassing FEM solve operations while achieving accuracy comparable to FEM solutions. The proposed framework further allows for a seamless integration of observed data for solving inverse problems and building generative models. We demonstrate the effectiveness of our method on a nonlinear Poisson problem, on elastic shells with complex 3D geometries, and by integrating generic physics-informed neural network (PINN) architectures. After training, we achieve up to three orders of magnitude speed-ups compared to traditional FEM solvers, while providing coherent uncertainty estimates.
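To make the training setup concrete, the following is a minimal, deterministic sketch of the core idea described above: a parameter-to-solution (forward) network and a solution-to-parameter (inverse) network trained jointly using only a discretized PDE residual, with no PDE solves at training time. The 1D Poisson problem, the network sizes, the loss weights, and the plain least-squares (rather than fully probabilistic) objective are illustrative assumptions, not the paper's exact PDDLVM formulation.

```python
# Hypothetical sketch (not the paper's exact probabilistic objective): jointly train a
# parameter-to-solution ("forward") network and a solution-to-parameter ("inverse") network
# for the 1D Poisson problem  -u''(x) = z  on (0, 1),  u(0) = u(1) = 0,
# using only a finite-difference PDE residual at training time (no solver calls).
import torch

torch.manual_seed(0)
n = 33                                   # number of interior grid points
h = 1.0 / (n + 1)                        # grid spacing

forward_net = torch.nn.Sequential(       # z -> discretized solution u (illustrative sizes)
    torch.nn.Linear(1, 64), torch.nn.Tanh(), torch.nn.Linear(64, n))
inverse_net = torch.nn.Sequential(       # u -> estimate of the parameter z
    torch.nn.Linear(n, 64), torch.nn.Tanh(), torch.nn.Linear(64, 1))

def pde_residual(u, z):
    """Finite-difference residual of -u'' = z with zero Dirichlet boundary values."""
    u_pad = torch.nn.functional.pad(u, (1, 1))               # enforce u(0) = u(1) = 0
    lap = (u_pad[:, :-2] - 2 * u_pad[:, 1:-1] + u_pad[:, 2:]) / h**2
    return -lap - z                                          # shape (batch, n)

params = list(forward_net.parameters()) + list(inverse_net.parameters())
opt = torch.optim.Adam(params, lr=1e-3)

for step in range(5000):
    z = torch.rand(128, 1) * 2.0 - 1.0                       # sample parameters from a prior
    u = forward_net(z)                                       # emulated solutions, no PDE solve
    loss_pde = pde_residual(u, z).pow(2).mean()              # physics-informed term
    loss_inv = (inverse_net(u) - z).pow(2).mean()            # inverse-map consistency term
    loss = loss_pde + loss_inv
    opt.zero_grad(); loss.backward(); opt.step()

# After training, both maps are available in a single network evaluation:
with torch.no_grad():
    z_test = torch.tensor([[0.5]])
    u_pred = forward_net(z_test)                             # fast forward emulation
    z_rec = inverse_net(u_pred)                              # recovered parameter
```

In the paper's framework, the residual term and the inverse network enter through a PDE-based density and a variational family, so uncertainty estimates accompany both maps; the sketch above keeps only the deterministic skeleton of that joint training loop.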