The Bayesian Central Limit Theorem (BCLT) for finite-dimensional models, also known as the Bernstein -- von Mises Theorem, is a primary motivation for the widely used Laplace approximation. But currently the BCLT is expressed only in terms of total variation (TV) distance and lacks non-asymptotic bounds on the rate of convergence that are readily computable in applications. Likewise, the Laplace approximation is not equipped with non-asymptotic quality guarantees for the vast classes of posteriors for which it is asymptotically valid. To understand its quality and real-problem applicability, we need finite-sample bounds that can be computed for a given model and data set. And to understand the quality of posterior mean and variance estimates, we need bounds on divergences other than the TV distance. Our work provides the first closed-form, finite-sample bounds on the quality of the Laplace approximation that do not require log concavity of the posterior or an exponential-family likelihood. We bound not only the TV distance but also (A) the Wasserstein-1 distance, which controls error in a posterior mean estimate, and (B) an integral probability metric that controls the error in a posterior variance estimate. We compute exact constants in our bounds for a variety of standard models, including logistic regression, and numerically investigate the utility of our bounds. And we provide a framework for analysis of more complex models.
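For readers unfamiliar with the approximation being analyzed, here is a minimal sketch (not from the paper) of the standard Laplace approximation: find the posterior mode by Newton's method and fit a Gaussian with covariance given by the negative inverse Hessian of the log posterior at the mode. The Beta-Bernoulli model, starting point, and step count below are illustrative choices, used here only because the exact posterior is available for comparison.

```python
import numpy as np

def laplace_approximation(log_post_grad, log_post_hess, theta0, n_steps=50):
    """Find the posterior mode via Newton's method, then return the
    Laplace (Gaussian) approximation N(mode, -1/hess(mode)) in 1-D."""
    theta = theta0
    for _ in range(n_steps):
        theta = theta - log_post_grad(theta) / log_post_hess(theta)
    mode = theta
    var = -1.0 / log_post_hess(mode)  # negative inverse Hessian at the mode
    return mode, var

# Illustrative example: Beta(2, 2) prior, 40 successes in 100 Bernoulli trials.
a, b, k, n = 2.0, 2.0, 40, 100
# Unnormalized log posterior: (a+k-1) log t + (b+n-k-1) log(1-t)
grad = lambda t: (a + k - 1) / t - (b + n - k - 1) / (1 - t)
hess = lambda t: -(a + k - 1) / t**2 - (b + n - k - 1) / (1 - t)**2

mode, var = laplace_approximation(grad, hess, theta0=0.5)

# Exact Beta(a+k, b+n-k) posterior mean and variance for comparison
ap, bp = a + k, b + n - k
exact_mean = ap / (ap + bp)
exact_var = ap * bp / ((ap + bp) ** 2 * (ap + bp + 1))
print(mode, var)            # Gaussian approximation parameters
print(exact_mean, exact_var)  # exact posterior moments
```

The gap between the Gaussian's mean/variance and the exact posterior moments is precisely the kind of error that the paper's Wasserstein-1 and integral-probability-metric bounds are designed to control non-asymptotically.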