Recent developments in Markov chain Monte Carlo (MCMC) algorithms allow us to run thousands of chains in parallel almost as quickly as a single chain, using hardware accelerators such as GPUs. While each chain still needs to forget its initial point during a warmup phase, the subsequent sampling phase can be shorter than in classical settings, where we run only a few chains. To determine if the resulting short chains are reliable, we need to assess how close the Markov chains are to their stationary distribution after warmup. The potential scale reduction factor $\widehat R$ is a popular convergence diagnostic but unfortunately can require a long sampling phase to work well. We present a nested design to overcome this challenge and a generalization of $\widehat R$ called \textit{nested} $\widehat R$. This new diagnostic works under conditions similar to $\widehat R$ and completes the workflow for GPU-friendly samplers. In addition, the proposed nesting provides theoretical insights into the utility of $\widehat R$, in both the classical and short-chain regimes.
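To make the discussion concrete, the classical diagnostic can be written out in its standard (Gelman--Rubin) form; the notation below follows common convention rather than this excerpt, and the nested variant itself is not reproduced here since its construction is introduced later in the paper. For $M$ chains of $N$ post-warmup draws $\theta^{(n,m)}$,
\[
B = \frac{N}{M-1}\sum_{m=1}^{M}\left(\bar\theta^{(\cdot,m)} - \bar\theta^{(\cdot,\cdot)}\right)^2,
\qquad
W = \frac{1}{M}\sum_{m=1}^{M} s_m^2,
\qquad
\widehat R = \sqrt{\frac{\frac{N-1}{N}\,W + \frac{1}{N}\,B}{W}},
\]
where $\bar\theta^{(\cdot,m)}$ is the mean of chain $m$, $\bar\theta^{(\cdot,\cdot)}$ the grand mean, and $s_m^2$ the sample variance within chain $m$. When $N$ is small, as in the many-short-chains regime, $B$ and $W$ are noisy estimates of the between- and within-chain variances, which is one way to see why $\widehat R$ can require a long sampling phase to work well.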