Bayesian methods quantify uncertainty through the posterior distribution. One source of difficulty is the computation of the normalizing constant: the exact posterior is generally intractable, so it must usually be approximated. Variational Inference (VI) methods approximate the posterior with a distribution, usually chosen from a simple family, by optimization. The main contribution of this work is a set of update rules for natural-gradient variational inference with a mixture of Gaussians, which can be run independently for each mixture component, potentially in parallel.
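To make the per-component, parallelizable flavor of such updates concrete, here is a minimal sketch (not the paper's actual update rules) for the conjugate special case: when both the variational factor and the target are Gaussian, a natural-gradient step on the ELBO in natural-parameter space reduces to a convex combination of the current and target natural parameters, and several components can be updated independently with one vectorized operation. All function names and the step size `rho` are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: for a Gaussian q(x) = N(m, s^2) the natural
# parameters are eta1 = m / s^2 and eta2 = -1 / (2 s^2). For a Gaussian
# target, the natural-gradient ELBO step is a convex combination of
# current and target natural parameters.

def natural_params(m, s2):
    # Map (mean, variance) to natural parameters, stacked as rows.
    return np.stack([m / s2, -0.5 / s2])

def mean_var(eta):
    # Inverse map: natural parameters back to (mean, variance).
    s2 = -0.5 / eta[1]
    return eta[0] * s2, s2

def ngvi_step(eta, eta_target, rho=0.1):
    # One natural-gradient ascent step with step size rho; exact for
    # conjugate Gaussian targets.
    return (1 - rho) * eta + rho * eta_target

# Three components, each with its own Gaussian target. The update is
# elementwise across components, so they could run in parallel.
targets_m = np.array([-2.0, 0.0, 3.0])
targets_s2 = np.array([0.5, 1.0, 2.0])
eta_target = natural_params(targets_m, targets_s2)

eta = natural_params(np.zeros(3), np.ones(3))  # initial q per component
for _ in range(200):
    eta = ngvi_step(eta, eta_target)

m_fit, s2_fit = mean_var(eta)
print(np.round(m_fit, 3))   # converges toward targets_m
print(np.round(s2_fit, 3))  # converges toward targets_s2
```

The convex-combination form is why the step size can be interpreted as a learning rate in [0, 1]; with rho = 1 the update jumps directly to the conjugate fixed point.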