A key challenge for modern Bayesian statistics is how to perform scalable inference of posterior distributions. To address this challenge, variational Bayes (VB) methods have emerged as a popular alternative to the classical Markov chain Monte Carlo (MCMC) methods. VB methods tend to be faster while achieving comparable predictive performance. However, there are few theoretical guarantees for VB. In this paper, we establish frequentist consistency and asymptotic normality of VB methods. Specifically, we connect VB methods to point estimates based on variational approximations, called frequentist variational approximations, and we use this connection to prove a variational Bernstein-von Mises theorem. The theorem leverages the theoretical characterizations of frequentist variational approximations to understand the asymptotic properties of VB. In summary, we prove that (1) the VB posterior converges to the Kullback-Leibler (KL) minimizer of a normal distribution centered at the truth; and (2) the corresponding variational expectation of the parameter is consistent and asymptotically normal. As applications of the theorem, we derive asymptotic properties of VB posteriors in Bayesian mixture models, Bayesian generalized linear mixed models, and Bayesian stochastic block models. We conduct a simulation study to illustrate these theoretical results.
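The consistency and asymptotic-normality claims can be illustrated with a small mean-field example. The sketch below runs coordinate-ascent variational inference (CAVI) for a conjugate Normal-Gamma model (a standard textbook setting, not one of the models analyzed in the paper); the variational mean of the location parameter concentrates at the truth, and the variational variance shrinks at the 1/n rate that a Bernstein-von Mises-type result predicts. The model, priors, and sample size are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
true_mu, true_prec = 2.0, 4.0                  # ground-truth mean and precision
n = 5000
x = rng.normal(true_mu, 1.0 / np.sqrt(true_prec), size=n)

# Hypothetical weakly informative priors:
# mu ~ N(mu0, (kappa0 * lam)^-1), lam ~ Gamma(a0, b0)
mu0, kappa0, a0, b0 = 0.0, 1.0, 1.0, 1.0

# Mean-field family q(mu, lam) = q(mu) q(lam); CAVI updates.
xbar = x.mean()
kappa_n = kappa0 + n
mu_n = (kappa0 * mu0 + n * xbar) / kappa_n     # E_q[mu] (fixed across iterations)
a_n = a0 + (n + 1) / 2
E_lam = a0 / b0                                # initialize E_q[lam]
for _ in range(50):
    var_mu = 1.0 / (kappa_n * E_lam)           # Var_q(mu)
    E_mu2 = var_mu + mu_n ** 2                 # E_q[mu^2]
    b_n = (b0
           + 0.5 * kappa0 * (E_mu2 - 2 * mu0 * mu_n + mu0 ** 2)
           + 0.5 * np.sum((x - mu_n) ** 2 + var_mu))
    E_lam = a_n / b_n                          # E_q[lam]

# The variational expectations track the truth, and Var_q(mu) is
# close to 1 / (n * true_prec), the asymptotic-normality rate.
print(mu_n, E_lam, var_mu)
```

Rerunning with larger n shows `mu_n` and `E_lam` drifting toward the true values while `var_mu` shrinks roughly like 1/n, which is the behavior that parts (1) and (2) of the theorem formalize.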