The emergence of big data has led to a growing interest in so-called convergence complexity analysis, which is the study of how the convergence rate of a Monte Carlo Markov chain (for an intractable Bayesian posterior distribution) scales as the underlying data set grows in size. Convergence complexity analysis of practical Monte Carlo Markov chains on continuous state spaces is quite challenging, and there have been very few successful analyses of such chains. One fruitful analysis was recently presented by Qin and Hobert (2021b), who studied a Gibbs sampler for a simple Bayesian random effects model. These authors showed that, under regularity conditions, the geometric convergence rate of this Gibbs sampler converges to zero as the data set grows in size. It is shown herein that similar behavior is exhibited by Gibbs samplers for more general Bayesian models that possess both random effects and traditional continuous covariates, the so-called mixed models. The analysis employs the Wasserstein-based techniques introduced by Qin and Hobert (2021b).
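For orientation, the "geometric convergence rate" referred to above can be made precise in terms of Wasserstein distance. The display below is a standard formulation of that quantity, offered only as a sketch; the notation ($K$, $\Pi$, $d_W$, $\rho_*$) is introduced here for illustration and is not taken from the abstract.

\[
\rho_* \;=\; \inf\Bigl\{\rho \in [0,1) \;:\; d_W\bigl(K^n(x,\cdot\,),\,\Pi\bigr) \le C(x)\,\rho^{\,n} \ \text{for all } n \ge 1 \text{ and some } C(x) < \infty \Bigr\},
\]

where $K$ denotes the Markov transition kernel of the Gibbs sampler, $\Pi$ the intractable posterior distribution, and $d_W$ a Wasserstein distance. In this notation, the behavior described above is that $\rho_*$, viewed as a function of the data set, converges to zero as the number of observations grows, under regularity conditions.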