In this paper, we explore adaptive inference based on variational Bayes. Although several studies have analyzed the contraction properties of variational posteriors, there is still a lack of a general and computationally tractable variational Bayes method that performs adaptive inference. To fill this gap, we propose a novel adaptive variational Bayes framework that can operate on a collection of models. The proposed framework first computes a variational posterior over each individual model separately and then combines them with certain weights to produce a variational posterior over the entire model. It turns out that this combined variational posterior is the member of a predefined family of approximating distributions that is closest to the posterior over the entire model. We show that the adaptive variational Bayes attains optimal contraction rates adaptively under very general conditions. In addition, we provide a methodology for maintaining the tractability and adaptive optimality of the adaptive variational Bayes even in the presence of an enormous number of individual models, such as sparse models. We apply the general results to several examples, including deep learning and sparse factor models, and derive new and adaptive inference results. Moreover, we consider the use of quasi-likelihoods in our framework. We formulate theoretical conditions on quasi-likelihoods to ensure adaptive concentration and discuss specific applications to stochastic block models and nonparametric regression with sub-Gaussian errors.
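The combination step described above can be illustrated with a minimal sketch. The snippet below assumes per-model evidence lower bounds (ELBOs) have already been computed (the numeric values here are purely illustrative, not fitted results) and forms mixing weights proportional to each model's prior probability times the exponentiated ELBO, one common weighting choice; the paper's exact weighting scheme may differ. The resulting adaptive variational posterior is then the mixture of the per-model variational posteriors with these weights.

```python
import math

# Hypothetical per-model ELBO values; in practice each model's
# variational posterior is fit separately and its ELBO recorded.
elbos = {"model_small": -120.4, "model_medium": -98.7, "model_large": -101.2}

# Prior over models (uniform here for simplicity).
log_prior = {k: math.log(1.0 / len(elbos)) for k in elbos}

# Unnormalized log-weights: log prior + ELBO, stabilized by
# subtracting the maximum before exponentiating (log-sum-exp trick).
log_w = {k: log_prior[k] + elbos[k] for k in elbos}
m = max(log_w.values())
unnorm = {k: math.exp(v - m) for k, v in log_w.items()}
z = sum(unnorm.values())
weights = {k: u / z for k, u in unnorm.items()}

# `weights` defines the mixture of per-model variational posteriors
# that constitutes the combined (adaptive) variational posterior.
print(weights)
```

Because the weights depend exponentially on ELBO differences, the best-fitting model dominates the mixture unless several models achieve comparable ELBOs, which is what allows the combined posterior to adapt to the unknown model complexity.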