In the context of solving inverse problems for physics applications within a Bayesian framework, we present a new approach, Markov Chain Generative Adversarial Neural Networks (MCGANs), to alleviate the computational costs associated with solving the Bayesian inference problem. GANs provide a suitable framework for aiding the solution of Bayesian inference problems, as they are designed to generate samples from complicated high-dimensional distributions. By training a GAN to sample from a low-dimensional latent space and then embedding it in a Markov Chain Monte Carlo method, we can sample from the posterior highly efficiently, replacing both the high-dimensional prior and the expensive forward map. We prove that the proposed methodology converges to the true posterior in the Wasserstein-1 distance and that sampling from the latent space is equivalent, in a weak sense, to sampling in the high-dimensional space. The method is showcased on two test cases where we perform state and parameter estimation simultaneously. In multiple test cases, including the important engineering setting of detecting leaks in pipelines, the approach is shown to be up to two orders of magnitude more accurate than alternative approaches, while also being up to two orders of magnitude computationally faster.
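To make the core idea concrete, the following is a minimal sketch of latent-space sampling with a trained generator, not the paper's implementation. It assumes the generator is available as a plain callable `generator(z)` mapping a latent vector to the predicted observations (so it stands in for both the prior and the forward map), a standard normal prior on the latent variables, additive Gaussian observation noise with known standard deviation `sigma`, and a simple random-walk Metropolis kernel; the actual MCMC kernel, noise model, and generator interface in the paper may differ.

```python
# Hypothetical sketch: random-walk Metropolis in the latent space of a
# trained GAN generator. The generator replaces both the high-dimensional
# prior and the expensive forward map.
import numpy as np

def log_likelihood(z, y_obs, generator, sigma):
    """Gaussian log-likelihood of the observations given latent z."""
    y_pred = generator(z)                 # generator acts as surrogate forward map
    residual = y_obs - y_pred
    return -0.5 * np.sum(residual ** 2) / sigma ** 2

def log_latent_prior(z):
    """Standard normal prior on the latent variables."""
    return -0.5 * np.sum(z ** 2)

def latent_mcmc(y_obs, generator, dim_z, sigma=0.05, step=0.1,
                n_samples=10_000, rng=None):
    """Sample the latent posterior with random-walk Metropolis."""
    rng = np.random.default_rng() if rng is None else rng
    z = rng.standard_normal(dim_z)
    log_post = log_likelihood(z, y_obs, generator, sigma) + log_latent_prior(z)
    samples = []
    for _ in range(n_samples):
        z_prop = z + step * rng.standard_normal(dim_z)
        log_post_prop = (log_likelihood(z_prop, y_obs, generator, sigma)
                         + log_latent_prior(z_prop))
        # Metropolis accept/reject step in the low-dimensional latent space.
        if np.log(rng.uniform()) < log_post_prop - log_post:
            z, log_post = z_prop, log_post_prop
        samples.append(z.copy())
    return np.array(samples)
```

Posterior samples of the high-dimensional states and parameters are then obtained by pushing the accepted latent samples through the generator, which is what makes sampling in the latent space equivalent, in the weak sense stated above, to sampling in the high-dimensional space.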