Bayesian computation plays an important role in modern machine learning and statistics for reasoning about uncertainty. A key computational challenge in Bayesian inference is to develop efficient techniques for approximating, or drawing samples from, posterior distributions. Stein variational gradient descent (SVGD) has been shown to be a powerful approximate inference algorithm for this problem. However, vanilla SVGD requires calculating the gradient of the target density and cannot be applied when the gradient is unavailable or too expensive to evaluate. In this paper we explore one way to address this challenge: constructing a local surrogate for the target distribution, from which the gradient can be obtained at much lower computational cost. More specifically, we approximate the forward model using a deep neural network (DNN) trained on a carefully chosen training set, whose choice also determines the quality of the surrogate. To this end, we propose a general adaptation procedure that refines the local approximation online without destroying the convergence of the resulting SVGD. This significantly reduces the computational cost of SVGD and leads to a suite of algorithms that are straightforward to implement. The new algorithm is illustrated on a set of challenging Bayesian inverse problems, and numerical experiments demonstrate a clear improvement in performance and applicability over standard SVGD.
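For concreteness, the following is a minimal NumPy sketch of the vanilla SVGD update with an RBF kernel and the median-heuristic bandwidth; the function names (`rbf_kernel`, `svgd_step`) are our own illustration, not from the paper. In the surrogate variant described above, `grad_log_p` would return the gradient of the DNN approximation to the target density rather than the exact (expensive) gradient.

```python
import numpy as np

def rbf_kernel(X, h=None):
    """RBF kernel matrix and its gradient w.r.t. the first argument.

    X: (n, d) particle positions. h: bandwidth; if None, use the
    median heuristic h = med^2 / log(n + 1).
    """
    diffs = X[:, None, :] - X[None, :, :]            # diffs[j, i] = x_j - x_i
    sq_dists = np.sum(diffs ** 2, axis=-1)
    if h is None:
        h = np.median(sq_dists) / np.log(X.shape[0] + 1)
    K = np.exp(-sq_dists / h)
    # grad_K[j, i] = d k(x_j, x_i) / d x_j = -2 (x_j - x_i) k(x_j, x_i) / h
    grad_K = -2.0 * diffs * K[:, :, None] / h
    return K, grad_K

def svgd_step(X, grad_log_p, step_size=1e-2):
    """One SVGD update: x_i += eps * phi(x_i), where
    phi(x_i) = (1/n) sum_j [k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i)].

    grad_log_p maps (n, d) positions to (n, d) gradients of log p; in the
    surrogate setting it is the gradient of the trained DNN approximation.
    """
    n = X.shape[0]
    K, grad_K = rbf_kernel(X)
    phi = (K @ grad_log_p(X) + grad_K.sum(axis=0)) / n
    return X + step_size * phi

# Toy usage: transport particles toward a 2-D standard normal,
# for which grad log p(x) = -x.
X = np.random.randn(50, 2) * 3.0
for _ in range(500):
    X = svgd_step(X, lambda X: -X, step_size=0.1)
```

The kernel gradient term acts as a repulsive force that keeps the particles spread out, which is why SVGD yields a set of samples rather than collapsing to a single mode.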