Generalized linear mixed models (GLMMs) are often used for analyzing correlated non-Gaussian data. The likelihood function in a GLMM is available only as a high-dimensional integral, and thus closed-form inference and prediction are not possible for GLMMs. Since the likelihood is not available in closed form, the associated posterior densities in Bayesian GLMMs are also intractable. Generally, Markov chain Monte Carlo (MCMC) algorithms are used for conditional simulation in GLMMs and for exploring these posterior densities. In this article, we present several state-of-the-art MCMC algorithms for fitting GLMMs. These MCMC algorithms include efficient data augmentation strategies as well as diffusion-based and Hamiltonian dynamics-based methods. The Langevin and Hamiltonian Monte Carlo methods presented here are applicable to any GLMM, and are illustrated using three of the most popular GLMMs, namely, the logistic and probit GLMMs for binomial data and the Poisson-log GLMM for count data. We also present efficient data augmentation algorithms for probit and logistic GLMMs. Some of these algorithms are compared using a numerical example.
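To make the intractability concrete, a generic form of the GLMM marginal likelihood (a sketch in standard notation; the symbols $\beta$, $u$, $x_i$, $z_i$, and $\Sigma$ are introduced here for illustration and are not taken verbatim from the article) is
\[
L(\beta, \Sigma \mid y) \;=\; \int_{\mathbb{R}^{q}} \prod_{i=1}^{n} f\!\left(y_i \mid x_i^{\top}\beta + z_i^{\top}u\right)\, \phi_{q}(u; 0, \Sigma)\, \mathrm{d}u,
\]
where $f$ is the response density implied by the link function (logistic or probit for binomial data, log link for Poisson counts), $x_i$ and $z_i$ are fixed- and random-effect covariate vectors, and $\phi_q$ is the $q$-variate normal density of the random effects $u$. The $q$-dimensional integral over $u$ generally has no closed form, which is precisely why data augmentation, Langevin, and Hamiltonian Monte Carlo algorithms are needed for inference.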