The emergence of big data has led to growing interest in so-called convergence complexity analysis, which is the study of how the convergence rate of a Monte Carlo Markov chain (for an intractable Bayesian posterior distribution) scales as the underlying data set grows in size. Convergence complexity analysis of practical Monte Carlo Markov chains on continuous state spaces is quite challenging, and there have been very few successful analyses of such chains. One fruitful analysis was recently presented by Qin and Hobert (2021b), who studied a Gibbs sampler for a simple Bayesian random effects model. These authors showed that, under regularity conditions, the geometric convergence rate of this Gibbs sampler converges to zero (indicating immediate convergence) as the data set grows in size. It is shown herein that similar behavior is exhibited by Gibbs samplers for more general Bayesian models that possess both random effects and traditional continuous covariates, that is, so-called mixed models. The analysis employs the Wasserstein-based techniques introduced by Qin and Hobert (2021b).
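To make the object of study concrete, the following is a minimal sketch of a two-block Gibbs sampler for a simple Bayesian one-way random effects model. The particular formulation used here (unit variances, a flat prior on the global mean mu, and the function name gibbs_random_effects) is an illustrative assumption and is not necessarily the exact model or sampler analyzed by Qin and Hobert (2021b).

import numpy as np

def gibbs_random_effects(y, n_iter=5000, rng=None):
    """Two-block Gibbs sampler for an illustrative one-way random effects
    model (assumed for illustration, not necessarily the model studied by
    Qin and Hobert (2021b)):

        y_ij | theta_i ~ N(theta_i, 1),   theta_i | mu ~ N(mu, 1),

    with a flat prior on mu.  y has shape (K, m): K groups, m replicates.
    Returns the sampled trace of the global mean mu.
    """
    rng = np.random.default_rng() if rng is None else rng
    K, m = y.shape
    ybar = y.mean(axis=1)        # per-group sample means
    theta = ybar.copy()          # initialize random effects at group means
    mu_draws = np.empty(n_iter)
    for t in range(n_iter):
        # mu | theta, y: under the flat prior, mu ~ N(mean(theta), 1/K).
        mu = rng.normal(theta.mean(), np.sqrt(1.0 / K))
        # theta_i | mu, y: conjugate normal update combining the N(mu, 1)
        # prior with m unit-variance observations from group i.
        post_var = 1.0 / (m + 1.0)
        post_mean = post_var * (m * ybar + mu)
        theta = rng.normal(post_mean, np.sqrt(post_var))
        mu_draws[t] = mu
    return mu_draws

# Example: simulate data with K groups.  In line with the behavior
# described above, larger K should yield a faster-mixing mu chain.
rng = np.random.default_rng(0)
K, m = 200, 5
theta_true = rng.normal(0.0, 1.0, size=K)
y = rng.normal(theta_true[:, None], 1.0, size=(K, m))
trace = gibbs_random_effects(y, n_iter=2000, rng=rng)
print(trace.mean(), trace.std())

Under the convergence behavior described above, draws of mu from chains like this one would be expected to decorrelate essentially immediately once the number of groups K is large.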