Modern computational advances have enabled easy parallel implementations of Markov chain Monte Carlo (MCMC). However, almost all work in estimating the variance of Monte Carlo averages, including the efficient batch means (BM) estimator, focuses on a single-chain MCMC run. We demonstrate that simply averaging covariance matrix estimators from multiple chains can yield critical underestimates for small Monte Carlo sample sizes, especially for slow-mixing Markov chains. We extend the work of \cite{arg:and:2006} and propose a multivariate replicated batch means (RBM) estimator that utilizes information from parallel chains, thereby correcting for the underestimation. Under weak conditions on the mixing rate of the process, RBM is strongly consistent and exhibits similar large-sample bias and variance to the BM estimator. We further establish superior theoretical properties of RBM by showing that, in the presence of positive correlation in the MCMC, the (negative) bias of the RBM estimator is smaller than that of the average BM estimator. Consequently, in small runs, the RBM estimator can be dramatically superior, as we demonstrate through a variety of examples.
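The contrast between averaging per-chain BM estimators and the replicated construction can be illustrated with a short sketch. The code below is a minimal illustration, not the authors' implementation: it assumes the RBM estimator pools non-overlapping batch means from all chains and centers them at the grand mean across chains, whereas the averaged BM estimator centers each chain's batch means at that chain's own mean. All function names (\texttt{batch\_means}, \texttt{avg\_bm}, \texttt{rbm}) and the choice of batch size are hypothetical.

\begin{verbatim}
# Minimal sketch (assumed form, not the authors' code) contrasting the
# averaged BM estimator with a replicated batch means (RBM) estimator
# for m parallel chains, each an (n, p) array of MCMC draws.
import numpy as np

def batch_means(chain, b):
    """Non-overlapping batch means of an (n, p) chain with batch size b."""
    n, p = chain.shape
    a = n // b                                          # number of batches
    return chain[:a * b].reshape(a, b, p).mean(axis=1)  # shape (a, p)

def avg_bm(chains, b):
    """Average of per-chain BM estimators, each centered at its own chain mean."""
    ests = []
    for chain in chains:
        bm = batch_means(chain, b)
        centered = bm - chain.mean(axis=0)
        ests.append(b * centered.T @ centered / (len(bm) - 1))
    return np.mean(ests, axis=0)

def rbm(chains, b):
    """Replicated BM (assumed construction): pool batch means from all chains
    and center at the grand mean, so between-chain variability is retained."""
    grand_mean = np.mean([c.mean(axis=0) for c in chains], axis=0)
    bm_all = np.vstack([batch_means(c, b) for c in chains])  # shape (m*a, p)
    centered = bm_all - grand_mean
    return b * centered.T @ centered / (len(bm_all) - 1)

# Example with m = 4 hypothetical chains of length n = 1000 and p = 2:
rng = np.random.default_rng(0)
chains = [rng.standard_normal((1000, 2)) for _ in range(4)]
print(avg_bm(chains, b=31))
print(rbm(chains, b=31))
\end{verbatim}

For slow-mixing chains whose sample means differ noticeably across replications, the grand-mean centering in \texttt{rbm} keeps that between-chain spread in the estimate, which is the mechanism by which the underestimation of the averaged BM estimator is corrected.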