This note revisits the classical results on Laplace approximation in a modern non-asymptotic and dimension-free form. Such an extension is motivated by applications to high-dimensional statistical and optimization problems. The established results provide explicit non-asymptotic bounds on the quality of a Gaussian approximation of the posterior distribution in total variation distance in terms of the so-called \emph{effective dimension} \( \dimL \). This value is defined through the interplay between the information contained in the data and in the prior distribution. In contrast to the prominent Bernstein--von Mises results, the impact of the prior is not negligible, and it allows one to keep the effective dimension small or moderate even if the true parameter dimension is huge or infinite. We also address the issue of using a Gaussian approximation with inexact parameters, with a focus on replacing the Maximum a Posteriori (MAP) value by the posterior mean, and we design a Bayesian optimization algorithm based on Laplace iterations. The results are specified to the case of nonlinear regression.