In statistical analysis, Monte Carlo (MC) is a classical method for numerical integration. For challenging sampling problems, Markov chain Monte Carlo (MCMC) is a commonly employed method. However, the MCMC estimator is biased after any fixed number of iterations. Unbiased MCMC, built on coupling techniques, removes this bias and allows many short chains to be run in parallel. Quasi-Monte Carlo (QMC), known for its high order of convergence, is an alternative to MC. By incorporating ideas from QMC into MCMC, Markov chain quasi-Monte Carlo (MCQMC) effectively reduces the variance of MCMC, especially in Gibbs samplers. This work presents a novel approach that integrates unbiased MCMC with MCQMC, called unbiased MCQMC. This method yields unbiased estimators while significantly improving the rate of convergence. Numerical experiments demonstrate that, for Gibbs sampling, unbiased MCQMC with a sample size of $N$ attains a faster root mean square error (RMSE) rate than the $O(N^{-1/2})$ rate of unbiased MCMC, approaching an RMSE rate of $O(N^{-1})$ for low-dimensional problems. Remarkably, even for a challenging 1049-dimensional P\'olya Gamma Gibbs sampler, the RMSE is still reduced severalfold for moderate sample sizes. In a parallel setting, unbiased MCQMC also outperforms unbiased MCMC, even when running with short chains.
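To make the coupling idea behind unbiased MCMC concrete, the following is a minimal illustrative sketch (not the paper's method) of the coupled-chain debiasing construction: two Metropolis--Hastings chains are run with a maximal coupling of their proposals and a shared acceptance uniform, and once they meet, the telescoping correction terms vanish, yielding an unbiased estimator from a finite run. The target (standard normal), the step size, and all function names here are assumptions for illustration only.

```python
import numpy as np

def npdf(x, mu, s):
    # Normal density, used inside the maximal coupling of proposals.
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

def max_coupling_normal(mu1, mu2, s, rng):
    # Maximal coupling of N(mu1, s^2) and N(mu2, s^2): returns (X, Y)
    # with the correct marginals and P(X == Y) as large as possible.
    x = rng.normal(mu1, s)
    if rng.uniform() * npdf(x, mu1, s) <= npdf(x, mu2, s):
        return x, x
    while True:
        y = rng.normal(mu2, s)
        if rng.uniform() * npdf(y, mu2, s) > npdf(y, mu1, s):
            return x, y

def mh_step(x, log_target, step, rng):
    # One plain random-walk Metropolis--Hastings step.
    p = rng.normal(x, step)
    if np.log(rng.uniform()) < log_target(p) - log_target(x):
        return p
    return x

def coupled_mh_step(x, y, log_target, step, rng):
    # Coupled MH step: maximally coupled proposals plus a common
    # acceptance uniform, so merged chains stay merged (faithfulness).
    px, py = max_coupling_normal(x, y, step, rng)
    u = np.log(rng.uniform())
    if u < log_target(px) - log_target(x):
        x = px
    if u < log_target(py) - log_target(y):
        y = py
    return x, y

def unbiased_mcmc(h, log_target, step, rng, max_iter=100_000):
    # Unbiased estimator H = h(X_0) + sum_{l=1}^{tau-1} (h(X_l) - h(Y_{l-1})),
    # where X runs one step ahead of Y and tau is the meeting time.
    x = rng.normal()            # X_0 (initial distribution is arbitrary)
    y = rng.normal()            # Y_0
    est = h(x)
    x = mh_step(x, log_target, step, rng)   # X_1
    for _ in range(max_iter):
        if x == y:              # chains have met: corrections stop
            return est
        est += h(x) - h(y)
        x, y = coupled_mh_step(x, y, log_target, step, rng)
    raise RuntimeError("chains did not meet within max_iter steps")

# Demo: estimate E[X] = 0 under a standard normal target by averaging
# many independent short coupled runs (the parallelizable structure
# the abstract refers to).
rng = np.random.default_rng(0)
log_target = lambda x: -0.5 * x * x
estimates = [unbiased_mcmc(lambda x: x, log_target, 1.0, rng)
             for _ in range(2000)]
mean_est = float(np.mean(estimates))
```

Each short run produces one unbiased estimate, so averaging independent replicates converges to the true expectation at the usual $O(N^{-1/2})$ MC rate; the paper's contribution is to drive the inputs with QMC points to push this rate toward $O(N^{-1})$.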