By formulating the inverse problem of partial differential equations (PDEs) as a statistical inference problem, the Bayesian approach provides a general framework for quantifying uncertainties. In the inverse problem of PDEs, parameters are defined on an infinite-dimensional function space, and the PDEs induce a computationally intensive likelihood function. In addition, sparse data tends to produce a multi-modal posterior. These features make it difficult to apply existing sequential Monte Carlo (SMC) algorithms. To overcome these difficulties, we propose new conditions on the likelihood functions, construct a Gaussian-mixture-based preconditioned Crank-Nicolson transition kernel, and establish the universal approximation property of the infinite-dimensional Gaussian mixture probability measure. Combining these three tools, we propose a new SMC algorithm, named SMC-GM. For this algorithm, we obtain a convergence theorem that allows Gaussian priors, showing that the sequential particle filter reproduces the true posterior distribution. Furthermore, the proposed algorithm is rigorously defined on the infinite-dimensional function space and therefore naturally exhibits the discretization-invariant property. Numerical experiments demonstrate that the new approach has a strong ability to probe the multi-modality of the posterior, significantly reduces the computational burden, and numerically exhibits the discretization-invariant property (important for large-scale problems).
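To make the transition-kernel idea concrete, the following is a minimal sketch of a standard preconditioned Crank-Nicolson (pCN) mutation step as it is typically used inside an SMC mutation stage, written for a finite-dimensional discretization of the parameter field. It is not the authors' Gaussian-mixture-based kernel, which modifies the proposal's reference measure; the function and argument names (`pcn_mutation`, `phi`, `prior_cov_sqrt`, `beta`) are illustrative assumptions rather than identifiers from the paper.

```python
import numpy as np

def pcn_mutation(u, phi, prior_cov_sqrt, beta=0.2, n_steps=5, rng=None):
    """Mutate one SMC particle with the standard pCN kernel.

    u              : current particle (1-D array, discretized parameter field)
    phi            : negative log-likelihood, phi(u) = -log L(y | u)
    prior_cov_sqrt : matrix square root of the discretized prior covariance C
    beta           : pCN step size in (0, 1]
    """
    rng = np.random.default_rng() if rng is None else rng
    phi_u = phi(u)
    for _ in range(n_steps):
        xi = prior_cov_sqrt @ rng.standard_normal(u.shape)   # xi ~ N(0, C)
        v = np.sqrt(1.0 - beta**2) * u + beta * xi           # pCN proposal
        phi_v = phi(v)
        # pCN acceptance: the Gaussian-prior terms cancel exactly,
        # so only the likelihood difference enters the ratio.
        if np.log(rng.uniform()) < phi_u - phi_v:
            u, phi_u = v, phi_v
    return u
```

Because the acceptance ratio is independent of the mesh used to discretize the field, this kernel is well defined in the infinite-dimensional limit, which is the property the abstract refers to as discretization invariance; the Gaussian-mixture variant proposed in the paper is designed to retain this property while better exploring multi-modal posteriors.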