In this paper, we propose a new stochastic optimization algorithm for Bayesian inference based on multilevel Monte Carlo (MLMC) methods. In Bayesian statistics, biased estimators of the model evidence have often been used as stochastic objectives because the existing debiasing techniques are computationally costly to apply. To overcome this issue, we apply an MLMC sampling technique to construct low-variance unbiased estimators of both the model evidence and its gradient. In the theoretical analysis, we show that the computational cost required for our proposed MLMC estimator to estimate the model evidence or its gradient to a given accuracy is an order of magnitude smaller than that of the previously known estimators. Our numerical experiments confirm considerable computational savings over the conventional estimators. Combining our MLMC estimator with gradient-based stochastic optimization yields a new scalable, efficient, debiased inference algorithm for Bayesian statistical models.
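To make the debiasing idea concrete, the following is a minimal, hedged sketch of a randomized ("single-term") MLMC estimator for a quantity of the form log E[w], whose naive plug-in estimator log(mean of w) is biased, as in evidence estimation. The toy weight model, the antithetic level coupling, the geometric level distribution, and all function names here are illustrative assumptions, not the paper's actual construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_weights(n):
    # Toy "likelihood weights" (an assumption for illustration):
    # w = exp(Z) with Z ~ N(0, 0.5), so log E[w] = 0.5 * 0.5**2 = 0.125.
    return np.exp(rng.normal(0.0, 0.5, size=n))

def level_difference(l):
    """Coupled level difference Delta_l = P_l - P_{l-1}.
    The level-l estimate uses 2**l weights; the level-(l-1) estimate
    averages the estimates from the two halves (antithetic coupling),
    so the difference has rapidly decaying variance."""
    w = sample_weights(2 ** l)
    if l == 0:
        return np.log(w.mean())
    half = 2 ** (l - 1)
    fine = np.log(w.mean())
    coarse = 0.5 * (np.log(w[:half].mean()) + np.log(w[half:].mean()))
    return fine - coarse

def single_term_mlmc(n_reps, p_geo=0.5, max_level=12):
    """Unbiased (up to the max_level truncation, kept finite here for
    practicality) estimator of sum_l E[Delta_l]: draw a random level
    L with geometric-like probabilities, return Delta_L / P(L)."""
    levels = np.arange(max_level + 1)
    probs = p_geo ** levels
    probs /= probs.sum()
    estimates = np.empty(n_reps)
    for i in range(n_reps):
        l = rng.choice(levels, p=probs)
        estimates[i] = level_difference(l) / probs[l]
    return estimates.mean()

est = single_term_mlmc(20000)  # should be near the true value 0.125
```

The same telescoping-plus-randomization device extends to the gradient of the objective by differentiating each level difference, which is what enables the gradient-based stochastic optimization described above.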