Power posteriors "robustify" standard Bayesian inference by raising the likelihood to a constant fractional power, effectively downweighting its influence in the calculation of the posterior. Power posteriors have been shown to be more robust to model misspecification than standard posteriors in many settings. Previous work has shown that power posteriors derived from low-dimensional, parametric locally asymptotically normal models are asymptotically normal (Bernstein-von Mises) even under model misspecification. We extend these results to show that the power posterior moments converge to those of the limiting normal distribution suggested by the Bernstein-von Mises theorem. We then use this result to show that the mean of the power posterior, a point estimator, is asymptotically equivalent to the maximum likelihood estimator.
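For context, a minimal sketch of the standard power posterior formulation, using illustrative notation not fixed by the abstract (prior $\pi_0$, i.i.d. observations $x_1,\dots,x_n$ with likelihood $p(x_i \mid \theta)$, and a fixed power $\eta$):
\[
  \pi_\eta(\theta \mid x_{1:n}) \;\propto\; \pi_0(\theta)\,\Bigl(\textstyle\prod_{i=1}^{n} p(x_i \mid \theta)\Bigr)^{\eta}, \qquad \eta \in (0,1).
\]
Taking $\eta = 1$ recovers the standard posterior, while $\eta < 1$ downweights the likelihood relative to the prior, which is the sense in which the abstract describes the likelihood's influence as being reduced.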