Variational Bayes (VB) is a popular scalable alternative to Markov chain Monte Carlo for Bayesian inference. We study a mean-field spike and slab VB approximation of widely used Bayesian model selection priors in sparse high-dimensional logistic regression. We provide non-asymptotic theoretical guarantees for the VB posterior in both $\ell_2$ and prediction loss for a sparse truth, giving optimal (minimax) convergence rates. Since the VB algorithm does not depend on the unknown truth to achieve optimality, our results shed light on effective prior choices. We confirm the improved performance of our VB algorithm over common sparse VB approaches in a numerical study.
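To make the object of study concrete, below is a minimal sketch of a mean-field spike-and-slab VB scheme for sparse logistic regression. It is an illustration, not the paper's exact algorithm: it uses coordinate-ascent updates under the standard Jaakkola–Jordan quadratic bound on the logistic likelihood, with the variational family $q(\beta_j) = \gamma_j\, N(\mu_j, s_j^2) + (1-\gamma_j)\,\delta_0$ and a Bernoulli–Gaussian spike-and-slab prior with inclusion weight `w` and slab variance `sigma0_sq` (both names are this sketch's own).

```python
import numpy as np

def spike_slab_vb_logistic(X, y, w=0.1, sigma0_sq=1.0, n_iter=50):
    """Hypothetical sketch: mean-field spike-and-slab VB for logistic
    regression via the Jaakkola-Jordan bound.

    Variational family: q(beta_j) = gamma_j N(mu_j, s2_j) + (1-gamma_j) delta_0.
    Returns inclusion probabilities gamma, slab means mu, slab variances s2.
    """
    n, p = X.shape
    yc = y - 0.5                      # linear term of the JJ quadratic bound
    gamma = np.full(p, 0.5)           # variational inclusion probabilities
    mu = np.zeros(p)                  # variational slab means
    s2 = np.ones(p)                   # variational slab variances
    xi = np.ones(n)                   # JJ bound tilting parameters
    logit_w = np.log(w / (1.0 - w))
    for _ in range(n_iter):
        lam = np.tanh(xi / 2.0) / (4.0 * xi)   # JJ curvature weights
        eta = X @ (gamma * mu)                 # E_q[x_i^T beta]
        for j in range(p):
            xj = X[:, j]
            # slab variance: prior precision plus bound curvature
            s2[j] = 1.0 / (1.0 / sigma0_sq + 2.0 * np.sum(lam * xj**2))
            eta_minus = eta - xj * gamma[j] * mu[j]
            # slab mean: residual signal after removing coordinate j
            mu[j] = s2[j] * np.sum(yc * xj - 2.0 * lam * xj * eta_minus)
            # posterior log-odds of inclusion for coordinate j
            logit_g = (logit_w + 0.5 * np.log(s2[j] / sigma0_sq)
                       + mu[j]**2 / (2.0 * s2[j]))
            gamma[j] = 1.0 / (1.0 + np.exp(-logit_g))
            eta = eta_minus + xj * gamma[j] * mu[j]
        # tilting update: xi_i^2 = E_q[(x_i^T beta)^2]
        var_beta = gamma * (s2 + mu**2) - (gamma * mu)**2
        xi = np.maximum(np.sqrt(eta**2 + (X**2) @ var_beta), 1e-8)
    return gamma, mu, s2

# Usage on synthetic sparse data (illustrative only)
rng = np.random.default_rng(0)
n, p = 400, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[0] = 4.0                    # one strong active coordinate
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ beta_true))).astype(float)
gamma, mu, s2 = spike_slab_vb_logistic(X, y)
```

Thresholding the inclusion probabilities `gamma` (e.g. at 0.5) then gives a sparse model selection; the paper's point is that, for suitable prior choices, the VB posterior built this way contracts at the minimax rate without knowing the true sparsity.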