A staple of Bayesian model comparison and hypothesis testing, Bayes factors are often used to quantify the relative predictive performance of two rival hypotheses. The computation of Bayes factors can be challenging, however, and this has contributed to the popularity of convenient approximations such as the BIC. Unfortunately, these approximations can fail in the case of informed prior distributions. Here we address this problem by outlining an approximation to informed Bayes factors for a focal parameter $\theta$. The approximation is computationally simple and requires only the maximum likelihood estimate $\hat\theta$ and its standard error. The approximation uses an estimated likelihood of $\theta$ and assumes that the posterior distribution for $\theta$ is unaffected by the choice of prior distribution for the nuisance parameters. The resulting Bayes factor for the null hypothesis $\mathcal{H}_0: \theta = \theta_0$ versus the alternative hypothesis $\mathcal{H}_1: \theta \sim g(\theta)$ is then easily obtained using the Savage--Dickey density ratio. Three real-data examples highlight the speed and closeness of the approximation compared to bridge sampling and Laplace's method. The proposed approximation facilitates Bayesian reanalyses of standard frequentist results, encourages application of Bayesian tests with informed priors, and alleviates the computational challenges that often frustrate both Bayesian sensitivity analyses and Bayes factor design analyses. The approximation is shown to suffer under small sample sizes and when the posterior distribution of the focal parameter is substantially influenced by the prior distributions on the nuisance parameters. The proposed methodology may also be used to approximate the posterior distribution for $\theta$ under $\mathcal{H}_1$.
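The method described above can be sketched in a few lines of code. The example below is a minimal illustration, not the authors' implementation: it assumes a normal approximation to the likelihood of $\theta$ (centered at $\hat\theta$ with standard deviation equal to the standard error) and, for simplicity, a normal informed prior $g(\theta)$, so that the posterior is available in closed form and the Savage--Dickey density ratio reduces to a ratio of two normal densities. The function name and parameters are hypothetical.

```python
import math

def normal_pdf(x, mean, sd):
    """Density of a normal distribution with given mean and standard deviation."""
    return math.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

def approx_bf01(theta_hat, se, theta0=0.0, prior_mean=0.0, prior_sd=1.0):
    """Sketch of an approximate Savage--Dickey Bayes factor BF01 for
    H0: theta = theta0 versus H1: theta ~ Normal(prior_mean, prior_sd^2).

    Assumes the likelihood of theta is approximately
    Normal(theta_hat, se^2), so the normal-normal conjugate update
    gives the posterior in closed form.
    """
    # Conjugate normal-normal update: precisions add, means are precision-weighted.
    post_var = 1.0 / (1.0 / prior_sd**2 + 1.0 / se**2)
    post_mean = post_var * (prior_mean / prior_sd**2 + theta_hat / se**2)
    # Savage--Dickey: BF01 = posterior density at theta0 / prior density at theta0.
    return (normal_pdf(theta0, post_mean, math.sqrt(post_var))
            / normal_pdf(theta0, prior_mean, prior_sd))
```

For a non-normal informed prior $g(\theta)$, the same idea applies but the posterior ordinate at $\theta_0$ would need to be obtained by numerical integration of the (approximate) likelihood against $g$.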