We consider the problem of estimating complex statistical latent variable models using variational Bayes methods. These methods are used when exact posterior inference is infeasible or computationally expensive; they approximate the posterior density with a family of tractable distributions whose parameters are estimated by optimisation. This article develops a flexible Gaussian mixture variational approximation in which we impose sparsity on the precision matrix of each Gaussian component to reflect the conditional independence structure of the model. By introducing sparsity in the precision matrix and parameterising it through its Cholesky factor, each Gaussian mixture component becomes parsimonious, with a reduced number of non-zero parameters, while still capturing the dependence in the posterior distribution. Fast estimation methods are developed, based on global and local variational boosting moves combined with natural gradients and variance reduction techniques. The local boosting moves adjust an existing mixture component, and optimisation is carried out only on a subset of the variational parameters of a new component; the subset is chosen to target the aspects in which the current approximation is poor. The local boosting moves are fast because only a small number of variational parameters need to be optimised. The efficacy of the approach is illustrated on simulated and real datasets by estimating generalised linear mixed models and state space models.
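To make the variational family concrete, the following is a minimal sketch in notation of our own choosing (the symbols $q_\lambda$, $\pi_k$, $\mu_k$, and $C_k$ are illustrative assumptions, not necessarily the paper's):
\[
q_\lambda(\theta) \;=\; \sum_{k=1}^{K} \pi_k \, \mathcal{N}\!\left(\theta;\; \mu_k,\; \Omega_k^{-1}\right),
\qquad
\Omega_k \;=\; C_k C_k^{\top},
\]
where each $C_k$ is a lower-triangular Cholesky factor whose zero pattern mirrors the conditional independence structure of the posterior, so that each component has far fewer free parameters than a dense covariance while retaining posterior dependence. The variational parameters $\lambda = \{(\pi_k, \mu_k, C_k)\}_{k=1}^{K}$ are optimised by stochastic gradient methods; under this reading, a local boosting move that adds a component $K+1$ would update only a targeted subset of $(\pi_{K+1}, \mu_{K+1}, C_{K+1})$, which is what makes such moves cheap.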