Identifying a low-dimensional informed parameter subspace offers a viable path to alleviating the dimensionality challenge in the sample-based solution of large-scale Bayesian inverse problems. This paper introduces a novel gradient-based dimension reduction method in which the informed subspace does not depend on the data. This permits an online-offline computational strategy in which the expensive detection of the low-dimensional structure of the problem is carried out in an offline phase, that is, before observing the data. This strategy is particularly relevant for multiple inversion problems, as the same informed subspace can be reused. The proposed approach allows controlling the approximation error (in expectation over the data) of the posterior distribution. We also present sampling strategies that exploit the informed subspace to efficiently draw samples from the exact posterior distribution. The method is successfully illustrated on two numerical examples: a PDE-based inverse problem with a Gaussian process prior and a tomography problem with Poisson data and a Besov-$\mathcal{B}^2_{11}$ prior.
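To fix ideas, the sketch below shows one common way a data-free, gradient-based informed subspace might be estimated offline: the dominant eigenvectors of a Monte Carlo estimate, under the prior, of an expected Gauss-Newton-type matrix built from the forward-model Jacobian. All names (forward_jacobian, noise_precision, rank) and the specific diagnostic matrix are illustrative assumptions, not the paper's exact construction or its certified error bound.

```python
# Illustrative sketch only: offline estimation of a data-free informed subspace
# from prior samples and forward-model gradients (no observed data required).
import numpy as np

def informed_subspace(forward_jacobian, prior_samples, noise_precision, rank):
    """Estimate an informed subspace using prior samples only.

    forward_jacobian : callable, x -> Jacobian of the forward model at x, shape (m, d)
    prior_samples    : array of shape (n, d), draws from the prior
    noise_precision  : array of shape (m, m), inverse observation-noise covariance
    rank             : number of informed directions to keep
    """
    d = prior_samples.shape[1]
    H = np.zeros((d, d))
    # Monte Carlo estimate of E_prior[ J(x)^T Gamma^{-1} J(x) ]
    for x in prior_samples:
        J = forward_jacobian(x)
        H += J.T @ noise_precision @ J
    H /= len(prior_samples)
    # Dominant eigenvectors span the (approximate) informed directions;
    # the remaining directions are treated as uninformed by the likelihood.
    eigvals, eigvecs = np.linalg.eigh(H)
    order = np.argsort(eigvals)[::-1]
    return eigvecs[:, order[:rank]], eigvals[order]

# Toy usage with a linear forward model G(x) = A x (constant Jacobian)
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 50))
basis, spectrum = informed_subspace(
    forward_jacobian=lambda x: A,
    prior_samples=rng.standard_normal((200, 50)),
    noise_precision=np.eye(5),
    rank=5,
)
print(basis.shape)  # (50, 5): columns span the informed subspace
```

Because the matrix averages over the prior rather than conditioning on a particular dataset, the resulting basis can be computed once offline and reused across multiple inversions, in the spirit of the offline-online strategy described above; the decay of the returned spectrum indicates how few directions suffice.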