Hierarchical Bayesian Poisson regression models (HBPRMs) provide a flexible approach to modeling the relationship between predictors and count response variables. Applying HBPRMs to large-scale datasets requires efficient inference algorithms because of the high computational cost of inferring many model parameters through random sampling. Although Markov Chain Monte Carlo (MCMC) algorithms are widely used for Bayesian inference, sampling with this class of algorithms is time-consuming for applications involving large-scale data and time-sensitive decision-making, partly due to the non-conjugacy of many models. To overcome this limitation, this research develops an approximate Gibbs sampler (AGS) that efficiently learns HBPRMs while maintaining inference accuracy. In the proposed sampler, the data likelihood is approximated with a Gaussian distribution such that the conditional posterior of the coefficients has a closed-form solution. Numerical experiments on real and synthetic datasets with both small and large counts demonstrate the superior performance of AGS compared with a state-of-the-art sampling algorithm, especially for large datasets.
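To make the Gaussian-approximation idea concrete, the sketch below is a minimal, illustrative Python example (not the paper's exact AGS): it replaces the log-linked Poisson likelihood with a Gaussian working response z_i = log(y_i + c) whose precision is taken as roughly y_i + c, an assumed approximation chosen here for illustration. With a Gaussian prior on the regression coefficients, their conditional posterior is then Gaussian, so a Gibbs-style update can draw them in closed form instead of requiring a Metropolis step.

```python
import numpy as np

def approx_gibbs_coefficient_step(X, y, prior_mean, prior_cov, rng, c=0.5):
    """One draw of regression coefficients under a Gaussian approximation
    to the Poisson likelihood (illustrative sketch, not the paper's exact AGS).

    The log-linked Poisson likelihood is replaced by a Gaussian working
    response z_i = log(y_i + c) with approximate precision y_i + c, which
    makes the conditional posterior of the coefficients Gaussian
    (conjugate with the Gaussian prior) and samplable in closed form.
    """
    z = np.log(y + c)                      # working response
    w = y + c                              # approximate precision of each z_i
    prior_prec = np.linalg.inv(prior_cov)
    post_prec = prior_prec + X.T @ (w[:, None] * X)
    post_cov = np.linalg.inv(post_prec)
    post_mean = post_cov @ (prior_prec @ prior_mean + X.T @ (w * z))
    return rng.multivariate_normal(post_mean, post_cov)

# Minimal usage on synthetic counts
rng = np.random.default_rng(0)
n, p = 500, 3
X = rng.normal(size=(n, p))
beta_true = np.array([0.5, -0.3, 0.8])
y = rng.poisson(np.exp(X @ beta_true))

beta_draws = np.array([
    approx_gibbs_coefficient_step(X, y, np.zeros(p), 10.0 * np.eye(p), rng)
    for _ in range(200)
])
print("posterior mean estimate:", beta_draws.mean(axis=0))
```

In a full hierarchical model, this closed-form coefficient update would be one block of a larger Gibbs sweep that also samples group-level parameters; the point of the approximation is that this block no longer needs accept/reject sampling.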