Evaluating the variability of posterior estimates is a key aspect of Bayesian model assessment. In this study, we focus on the posterior covariance matrix W, defined through the log-likelihoods of individual observations. Previous studies, notably MacEachern and Peruggia (2002) and Thomas et al. (2018), examined the role of the principal space of W in Bayesian sensitivity analysis. Here, we show that the principal space of W is also central to frequentist evaluation, using the recently proposed Bayesian infinitesimal jackknife (Bayesian IJ) approximation (Giordano and Broderick, 2023) as a key tool. We further clarify the relationship between W and the Fisher kernel, showing that a modified version of the Fisher kernel can be viewed as an approximation to W. Moreover, the matrix W itself can be interpreted as a reproducing kernel, which we refer to as the W-kernel. Based on this connection, we investigate the relationship between the W-kernel formulation in the data space and the classical asymptotic formulation in the parameter space. We also introduce a matrix Z that is effectively dual to W in the sense of PCA; this formulation provides another perspective on the relationship between W and classical asymptotic theory. In the appendices, we explore approximate bootstrap methods for posterior means and show that projection onto the principal space of W facilitates frequentist evaluation when higher-order terms are included. We also introduce incomplete Cholesky decomposition as an efficient method for computing the principal space of W, and discuss the concept of representative subsets of observations.
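As a minimal sketch of the central object, the construction of W from posterior draws can be illustrated as follows. This assumes W is the posterior covariance matrix of the per-observation log-likelihoods, computed from MCMC output, with its principal space obtained by eigendecomposition; the toy model, variable names, and scaling convention are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)
S, n = 2000, 10  # number of posterior draws, number of observations

# Hypothetical stand-in for MCMC output: loglik[s, i] = log p(y_i | theta_s),
# here from a toy normal model with unit variance.
theta = rng.normal(size=S)          # posterior draws (toy)
y = rng.normal(size=n)              # observations (toy)
loglik = -0.5 * (y[None, :] - theta[:, None]) ** 2

# W as the posterior covariance matrix of the per-observation
# log-likelihoods (assumed definition, up to the paper's scaling).
centered = loglik - loglik.mean(axis=0)
W = centered.T @ centered / S       # n x n, symmetric PSD

# Principal space of W via eigendecomposition.
eigvals, eigvecs = np.linalg.eigh(W)
order = np.argsort(eigvals)[::-1]
top_vecs = eigvecs[:, order[:3]]    # leading principal directions of W
```

Because W is built from an S x n matrix of centered log-likelihoods, its rank is at most min(S, n), which is what makes low-rank routes to its principal space (such as the incomplete Cholesky decomposition mentioned above) attractive for large n.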