We introduce a new class of spatially stochastic, physics- and data-informed deep latent models for parametric partial differential equations (PDEs) which operate through scalable variational neural processes. We achieve this by assigning probability measures to the spatial domain, which allows us to treat collocation grids probabilistically as random variables to be marginalised out. Adopting this spatial-statistics view, we solve forward and inverse problems for parametric PDEs in a way that leads to the construction of Gaussian process models of solution fields. The implementation of these random grids poses a unique set of challenges for inverse physics-informed deep learning frameworks, and we propose a new architecture, Grid Invariant Convolutional Networks (GICNets), to overcome them. We further show how to incorporate noisy data in a principled manner into our physics-informed model to improve predictions for problems where data may be available but whose measurement locations do not coincide with any fixed mesh or grid. The proposed method is tested on a nonlinear Poisson problem, Burgers' equation, and the Navier-Stokes equations, and we provide extensive numerical comparisons. We demonstrate significant computational advantages over current physics-informed neural learning methods for parametric PDEs while improving the predictive capabilities and flexibility of these models.
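To make the core idea concrete, the following is a minimal sketch (not the paper's implementation) of what it means to treat collocation grids as random variables to be marginalised out: instead of evaluating a PDE residual on one fixed grid, grids are drawn from a probability measure on the domain and the squared residual is averaged over them by Monte Carlo. All names here are illustrative, and a uniform measure on a 1-D domain with the Poisson problem -u''(x) = f(x) is assumed purely for simplicity.

```python
import numpy as np

def sample_collocation_grid(n_points, rng):
    # Draw one random collocation grid from a probability measure on the
    # spatial domain; here, a uniform measure on [0, 1] (an assumption).
    return rng.uniform(0.0, 1.0, size=n_points)

def poisson_residual(u_xx, f, x):
    # Pointwise residual of the Poisson problem -u''(x) = f(x).
    return -u_xx(x) - f(x)

def marginalised_residual_loss(u_xx, f, n_grids=32, n_points=64, seed=0):
    # Monte Carlo estimate of the expected squared residual, where the
    # expectation marginalises over the random collocation grids.
    rng = np.random.default_rng(seed)
    losses = []
    for _ in range(n_grids):
        x = sample_collocation_grid(n_points, rng)
        r = poisson_residual(u_xx, f, x)
        losses.append(np.mean(r ** 2))
    return float(np.mean(losses))

# Exact solution u(x) = sin(pi x) of -u'' = pi^2 sin(pi x):
# its marginalised residual loss is zero for every sampled grid.
u_xx = lambda x: -np.pi ** 2 * np.sin(np.pi * x)
f = lambda x: np.pi ** 2 * np.sin(np.pi * x)
loss = marginalised_residual_loss(u_xx, f)

# A wrong candidate (u'' = 0 everywhere) yields a strictly positive loss.
bad_loss = marginalised_residual_loss(lambda x: np.zeros_like(x), f)
```

In a learned model, `u_xx` would come from automatic differentiation of a neural surrogate, and the grid-invariance challenge the abstract mentions arises because each gradient step sees a different set of collocation points.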