We use the theory of normal variance-mean mixtures to derive a data augmentation scheme for models that include gamma functions. Our methodology applies to many settings in statistics and machine learning, including Multinomial-Dirichlet distributions, negative binomial regression, Poisson-Gamma hierarchical models, and extreme value models, to name but a few. All of these models include a gamma function that does not admit a natural conjugate prior distribution, posing a significant challenge to inference and prediction. To provide a data augmentation strategy, we construct and develop the theory of the class of P\'olya Inverse Gamma distributions. This allows scalable EM and MCMC algorithms to be developed. We illustrate our methodology on a number of examples, including gamma shape inference, negative binomial regression, and Dirichlet allocation. Finally, we conclude with directions for future research.
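For reference, the normal variance-mean mixture representation invoked above has the standard form (a general sketch, not the paper's specific P\'olya Inverse Gamma construction): a random variable $X$ is represented conditionally on a nonnegative mixing variable $V$ as
\[
X \mid V \sim \mathcal{N}\left(\mu + \beta V,\; V\right), \qquad V \sim \pi(v),
\]
so that marginalizing over $V$ gives the mixture density $p(x) = \int_0^\infty \phi\!\left(x ;\, \mu + \beta v,\, v\right)\pi(v)\,dv$, where $\phi(\cdot;\, m, v)$ denotes the normal density with mean $m$ and variance $v$. Data augmentation schemes of this type treat $V$ as a latent variable so that conditional updates become tractable.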