Variational Bayes (VB) is a popular tool for Bayesian inference in statistical modeling. Recently, several VB algorithms have been proposed to handle intractable likelihoods, with applications such as approximate Bayesian computation. In this paper, we propose several unbiased estimators based on multilevel Monte Carlo (MLMC) for the gradient of the Kullback-Leibler divergence between the posterior distribution and the variational distribution, in the setting where the likelihood is intractable but can be estimated unbiasedly. The resulting VB algorithm differs from VB algorithms in the literature, which typically rely on biased gradient estimators. Moreover, we incorporate randomized quasi-Monte Carlo (RQMC) sampling within the MLMC-based gradient estimators; RQMC is known to provide a favorable rate of convergence in numerical integration. Theoretical guarantees for RQMC are provided in this new setting. Numerical experiments show that using RQMC within MLMC greatly speeds up the VB algorithm and yields better parameter values than some existing competitors.
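To make the idea of an unbiased MLMC-based gradient estimator with RQMC sampling concrete, the following is a minimal sketch, not the paper's algorithm: a single-term randomized MLMC estimator in the spirit of Rhee and Glynn, whose expectation equals the exact gradient, with each level sampled via scrambled Sobol' (RQMC) points. The toy gradient, the level coupling, the geometric level distribution, and all names below are illustrative assumptions, not the authors' construction.

```python
# A minimal sketch (assumed toy setup, not the authors' estimator):
# unbiased gradient estimation via a single-term randomized MLMC
# telescoping sum, with RQMC (scrambled Sobol') sampling per level.
import numpy as np
from scipy.stats import qmc
from scipy.special import ndtri  # inverse standard normal CDF

rng = np.random.default_rng(0)

def level_samples(l, n_base=4):
    """Draw n_base * 2**l standard normals from scrambled Sobol' points."""
    sampler = qmc.Sobol(d=1, scramble=True, seed=int(rng.integers(1 << 31)))
    u = sampler.random(n_base * 2**l)   # RQMC points in (0, 1)
    return ndtri(u).ravel()             # map to N(0, 1)

def delta(l, lam):
    """Coupled level difference Y_l - Y_{l-1} for the toy gradient
    g(lam) = lam - E[z], z ~ N(0, 1). The coarse estimate reuses half
    of the fine-level samples, so the difference has small variance."""
    z = level_samples(l)
    y_fine = lam - z.mean()
    if l == 0:
        return y_fine
    y_coarse = lam - z[: z.size // 2].mean()
    return y_fine - y_coarse

def unbiased_mlmc_grad(lam, max_level=10, r=0.5):
    """Single-term estimator: sample a level L with probability p_L and
    return delta(L) / p_L, which is unbiased for the telescoping sum
    sum_l E[delta_l] = g(lam)."""
    p = r ** np.arange(max_level + 1)
    p /= p.sum()
    L = rng.choice(max_level + 1, p=p)
    return delta(L, lam) / p[L]

# Averaging independent copies recovers the true gradient g(1.0) = 1.0.
est = np.mean([unbiased_mlmc_grad(1.0) for _ in range(2000)])
print(est)  # approx 1.0
```

In this sketch the unbiasedness comes from the random level L: each level difference enters the expectation with weight p_L / p_L = 1, so no fixed truncation bias remains, which mirrors the abstract's contrast with biased gradient estimators; the scrambled Sobol' points play the role of the RQMC sampling incorporated within each level.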