In this paper, we propose an infinite-dimensional version of the Stein variational gradient descent (iSVGD) method for solving Bayesian inverse problems. The method generates approximate samples from the posterior efficiently. Based on the concepts of operator-valued kernels and vector-valued reproducing kernel Hilbert spaces, a rigorous definition is given for the infinite-dimensional objects, such as the Stein operator, which are proved to be the limits of their finite-dimensional counterparts. Moreover, a more efficient iSVGD with preconditioning operators is constructed by generalizing the change-of-variables formula and introducing a regularity parameter. The proposed algorithms are applied to an inverse problem for the steady-state Darcy flow equation. Numerical results confirm our theoretical findings and demonstrate the potential of the proposed approach for posterior sampling in large-scale nonlinear statistical inverse problems.
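For orientation, recall the standard finite-dimensional SVGD update of Liu and Wang (2016), which the iSVGD proposed here lifts to function spaces via operator-valued kernels; the notation below ($\pi$ for the unnormalized posterior density, $k$ for a positive-definite kernel, $\varepsilon$ for the step size) is introduced only for this illustration and is not necessarily that of the present paper:
\[
  x_i \;\leftarrow\; x_i + \varepsilon\,\hat{\phi}^{*}(x_i),
  \qquad
  \hat{\phi}^{*}(x) \;=\; \frac{1}{n}\sum_{j=1}^{n}
  \Big[\, k(x_j, x)\,\nabla_{x_j}\log \pi(x_j) \;+\; \nabla_{x_j} k(x_j, x) \,\Big],
\]
where $\{x_i\}_{i=1}^{n}$ are the particles approximating the posterior. The preconditioned variants developed in this work modify this transport direction through preconditioning operators, as described above.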