We provide a deepened study of autocorrelations in Neural Markov Chain Monte Carlo (NMCMC) simulations, a version of the traditional Metropolis algorithm which employs neural networks to provide independent proposals. We illustrate our ideas using the two-dimensional Ising model. We discuss several estimates of autocorrelation times in the context of NMCMC, some inspired by analytical results derived for the Metropolized Independent Sampler (MIS). We check their reliability by estimating them on a small system where analytical results can also be obtained. Based on the analytical results for MIS, we propose a new loss function and study its impact on the autocorrelation times. Although this function's performance is slightly inferior to that of the traditional Kullback-Leibler divergence, it offers two training algorithms which may be beneficial in some situations. By studying a small, $4 \times 4$, system we gain access to the dynamics of the training process, which we visualize using several observables. Furthermore, we quantitatively investigate the impact of imposing the global discrete symmetries of the system during neural network training on the autocorrelation times. Finally, we propose a scheme which incorporates partial heat-bath updates and considerably improves the quality of the training. The impact of the above enhancements is discussed for a $16 \times 16$ spin system. The summary of our findings may serve as guidance for the implementation of Neural Markov Chain Monte Carlo simulations for more complicated models.