The iterated conditional sequential Monte Carlo (i-CSMC) algorithm from Andrieu, Doucet and Holenstein (2010) is an MCMC approach for efficiently sampling from the joint posterior distribution of the $T$ latent states in challenging time-series models, e.g. in non-linear or non-Gaussian state-space models. It is also the main ingredient in particle Gibbs samplers which infer unknown model parameters alongside the latent states. In this work, we first prove that the i-CSMC algorithm suffers from a curse of dimension in the dimension of the states, $D$: it breaks down unless the number of samples ("particles"), $N$, proposed by the algorithm grows exponentially with $D$. Then, we present a novel "local" version of the algorithm which proposes particles using Gaussian random-walk moves that are suitably scaled with $D$. We prove that this iterated random-walk conditional sequential Monte Carlo (i-RW-CSMC) algorithm avoids the curse of dimension: for arbitrary $N$, its acceptance rates and expected squared jumping distance converge to non-trivial limits as $D \to \infty$. If $T = N = 1$, our proposed algorithm reduces to a Metropolis--Hastings or Barker's algorithm with Gaussian random-walk moves and we recover the well-known scaling limits for such algorithms.
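The dimension scaling invoked in the abstract is the classical one for Gaussian random-walk Metropolis--Hastings: with proposal standard deviation $\ell / \sqrt{D}$, the acceptance rate converges to a non-trivial limit as $D \to \infty$ rather than collapsing to zero. A minimal sketch of this behaviour (for a standard Gaussian target, not the paper's state-space setting; the function name and constants are illustrative):

```python
import numpy as np

def rwm_acceptance_rate(D, n_iter=20000, ell=2.38, seed=0):
    """Random-walk Metropolis targeting a standard D-dimensional Gaussian,
    with proposal standard deviation scaled as ell / sqrt(D).

    Returns the empirical acceptance rate over n_iter iterations.
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(D)                      # start at the mode
    log_target = lambda z: -0.5 * z @ z  # log-density up to a constant
    step = ell / np.sqrt(D)              # the D^{-1/2} scaling
    accepted = 0
    for _ in range(n_iter):
        proposal = x + step * rng.standard_normal(D)
        # Metropolis--Hastings accept/reject step
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
            accepted += 1
    return accepted / n_iter

for D in (1, 10, 100):
    print(D, round(rwm_acceptance_rate(D), 3))
```

With this scaling, the acceptance rate stays bounded away from zero as $D$ grows (approaching roughly $0.234$ for $\ell = 2.38$), which is the well-known limit the abstract refers to in the $T = N = 1$ case; an unscaled proposal would instead see its acceptance rate vanish as $D \to \infty$.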