For solving Bayesian inverse problems governed by large-scale forward problems, we present an infinite-dimensional version of the Stein variational gradient descent (iSVGD) method, which can efficiently generate approximate samples from the posterior. Specifically, we introduce the concept of an operator-valued kernel and the corresponding function-valued reproducing kernel Hilbert space (RKHS). Using the properties of this RKHS, we give an explicit meaning to the infinite-dimensional objects (e.g., the Stein operator) and prove that they are indeed the limits of their finite-dimensional counterparts. Furthermore, by generalizing the change-of-variables formula, we construct an iSVGD with preconditioning operators, yielding a more efficient method. In these generalizations, we introduce a regularity parameter $s\in[0,1]$. Our analysis shows that the intuitive trivial version of iSVGD with preconditioning operators (i.e., directly taking finite-dimensional objects as their infinite-dimensional counterparts, $s=0$) yields inaccurate estimates, and that $s$ should be chosen strictly between $0$ and $0.5$. Finally, the proposed algorithms are applied to an inverse problem governed by the Helmholtz equation. Numerical results confirm our theoretical findings and demonstrate the potential of the proposed approach for posterior sampling in large-scale nonlinear statistical inverse problems.
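To make the starting point concrete, the following is a minimal sketch of the standard finite-dimensional SVGD update (Liu and Wang's particle method), whose infinite-dimensional limit the abstract studies. The Gaussian target, RBF kernel, bandwidth, and step size are illustrative assumptions, not choices made in the paper; the preconditioning operators and the regularity parameter $s$ are not modeled here.

```python
import numpy as np

def svgd_step(X, grad_log_p, h=1.0, eps=0.1):
    """One finite-dimensional SVGD update on particles X of shape (n, d).

    phi(x_i) = (1/n) * sum_j [ k(x_j, x_i) * grad log p(x_j)
                               + grad_{x_j} k(x_j, x_i) ]
    with an RBF kernel k(x, y) = exp(-||x - y||^2 / (2 h^2)).
    Bandwidth h and step size eps are illustrative fixed values.
    """
    n = X.shape[0]
    # Pairwise squared distances and kernel matrix K[j, i] = k(x_j, x_i).
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / (2.0 * h ** 2))
    G = grad_log_p(X)  # (n, d) scores evaluated at the particles
    # grad_{x_j} k(x_j, x_i) = K[j, i] * (x_i - x_j) / h^2, stored at [j, i, :].
    grad_K = K[:, :, None] * (X[None, :, :] - X[:, None, :]) / h ** 2
    # Driving term (kernel-smoothed score) plus repulsive term.
    phi = (K.T @ G + grad_K.sum(axis=0)) / n
    return X + eps * phi

# Toy usage: transport poorly initialized particles toward N(0, 1),
# whose score is grad log p(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, scale=1.0, size=(100, 1))
for _ in range(500):
    X = svgd_step(X, lambda X: -X)
```

The repulsive term `grad_K.sum(axis=0)` is what keeps the particles spread out as an ensemble approximating the posterior, rather than collapsing onto the MAP point; it is this update direction whose well-definedness in function space the operator-valued kernel construction addresses.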