Hamiltonian Monte Carlo (HMC) is an efficient method of simulating smooth distributions and has motivated the widely used No-U-turn Sampler (NUTS) and software Stan. We build on NUTS and the technique of "unbiased sampling" to design HMC algorithms that produce perfect simulation of general continuous distributions that are amenable to HMC. Our methods enable separation of Markov chain Monte Carlo convergence error from experimental error, and thereby provide much more powerful MCMC convergence diagnostics than current state-of-the-art summary statistics which confound these two errors. Objective comparison of different MCMC algorithms is provided by the number of derivative evaluations per perfect sample point. We demonstrate the methodology with applications to normal, $t$ and normal mixture distributions up to 100 dimensions, and a 12-dimensional Bayesian Lasso regression. HMC runs effectively with a goal of 20 to 30 points per trajectory. Numbers of derivative evaluations per perfect sample point range from 390 for a univariate normal distribution to 12,000 for a 100-dimensional mixture of two normal distributions with modes separated by six standard deviations, and 22,000 for a 100-dimensional $t$-distribution with four degrees of freedom.