We study the problem of designing efficient exact MCMC algorithms for sampling from the full posterior distribution of high-dimensional (in the number of time steps and the dimension of the latent space) non-linear non-Gaussian latent dynamical models. Particle Gibbs, also known as conditional sequential Monte Carlo (SMC), constitutes the de facto gold standard for this task, but it suffers from degeneracy problems as the dimension of the latent space increases. On the other hand, routinely employed biased solutions based on global Gaussian approximations (e.g., extended Kalman filtering) are seldom used for this purpose, even though they are more robust than their SMC counterparts. In this article, we show how, by introducing auxiliary observation variables into the model, we can both implement efficient exact Kalman-based samplers for large state-space models and dramatically improve the mixing speed of particle Gibbs algorithms as the dimension of the latent space increases. We demonstrate when and how these auxiliary samplers can be parallelised along the time dimension, resulting in algorithms that scale logarithmically with the number of time steps when implemented on graphics processing units (GPUs). Both algorithms are easily tuned and can be extended to accommodate sophisticated approximation techniques. We demonstrate the improved statistical and computational performance of our auxiliary samplers compared to state-of-the-art alternatives for high-dimensional (in both time and state space) latent dynamical systems.
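The logarithmic scaling in the number of time steps rests on expressing sequential recursions as prefix compositions of associative operations, which a GPU can evaluate in O(log T) depth. The sketch below is not the paper's sampler; it only illustrates this parallel-in-time mechanism on a toy linear-Gaussian prior, assuming JAX's `jax.lax.associative_scan`, and the helper names (`combine`, `simulate_lgssm_prior`) are illustrative.

```python
import jax
import jax.numpy as jnp


def combine(elem_i, elem_j):
    # Associative composition of affine maps: applying (A_i, b_i) then (A_j, b_j)
    # yields (A_j A_i, A_j b_i + b_j). Written with einsum so it also works on the
    # batched (leading time axis) arguments that associative_scan passes in.
    A_i, b_i = elem_i
    A_j, b_j = elem_j
    A = jnp.einsum('...ij,...jk->...ik', A_j, A_i)
    b = jnp.einsum('...ij,...j->...i', A_j, b_i) + b_j
    return A, b


def simulate_lgssm_prior(key, A, Q_chol, x0, T):
    # Draws x_t = A x_{t-1} + eps_t, eps_t ~ N(0, Q_chol Q_chol^T), for t = 1..T,
    # but in parallel over time: each step is the affine map (A, eps_t), and all
    # prefix compositions are computed with a single associative scan.
    d = x0.shape[0]
    eps = jax.random.normal(key, (T, d)) @ Q_chol.T
    As = jnp.broadcast_to(A, (T, d, d))
    A_cum, b_cum = jax.lax.associative_scan(combine, (As, eps))
    # Prefix (A_cum[t], b_cum[t]) maps x0 to x_{t+1}.
    return jnp.einsum('tij,j->ti', A_cum, x0) + b_cum


key = jax.random.PRNGKey(0)
d, T = 2, 1024
A = 0.9 * jnp.eye(d)
Q_chol = 0.1 * jnp.eye(d)
x0 = jnp.zeros(d)
traj = simulate_lgssm_prior(key, A, Q_chol, x0, T)  # shape (T, d)
```

The same prefix-composition idea extends to Kalman-type filtering and smoothing recursions, which is what makes the auxiliary Kalman-based samplers amenable to parallelisation in time.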