Bayesian likelihood-free inference, which is used to perform Bayesian inference when the likelihood is intractable, enjoys an increasing number of important scientific applications. However, many aspects of a Bayesian analysis become more challenging in the likelihood-free setting. One example of this is prior-data conflict checking, where the goal is to assess whether the information in the data and the prior is inconsistent. Conflicts of this kind are important to detect, since they may reveal problems in an investigator's understanding of what the relevant parameter values are, and can result in sensitivity of Bayesian inferences to the prior. Here we consider methods for prior-data conflict checking which are applicable regardless of whether the likelihood is tractable or not. In constructing our checks, we consider checking statistics based on prior-to-posterior Kullback-Leibler divergences. The checks are implemented using mixture approximations to the posterior distribution and closed-form approximations to Kullback-Leibler divergences for mixtures, which make Monte Carlo approximation of reference distributions for calibration computationally feasible. When prior-data conflicts occur, it is useful to consider weakly informative prior specifications in alternative analyses as part of a sensitivity analysis. As a main application of our methodology, we develop a technique for searching for weakly informative priors in likelihood-free inference, where the notion of a weakly informative prior is formalized using prior-data conflict checks. The methods are demonstrated in three examples.
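To make the logic of the check concrete, the following is a minimal Python sketch in a hypothetical, fully tractable normal-normal conjugate setting: the checking statistic is a prior-to-posterior Kullback-Leibler divergence, and its reference distribution for calibration is approximated by Monte Carlo over replicate data drawn from the prior predictive. The model, parameter values, and the direction of the divergence are illustrative assumptions; in the likelihood-free setting of the paper the exact posterior would be replaced by a mixture approximation with closed-form mixture KL approximations.

```python
import numpy as np

def kl_normal(m0, s0, m1, s1):
    """Closed-form KL( N(m0, s0^2) || N(m1, s1^2) )."""
    return np.log(s1 / s0) + (s0**2 + (m0 - m1)**2) / (2 * s1**2) - 0.5

def posterior_params(y, mu0, tau0, sigma):
    """Conjugate normal-normal posterior for a mean, known sigma."""
    n = len(y)
    prec = 1.0 / tau0**2 + n / sigma**2
    tau_n = np.sqrt(1.0 / prec)
    mu_n = (mu0 / tau0**2 + n * np.mean(y) / sigma**2) / prec
    return mu_n, tau_n

def check_statistic(y, mu0, tau0, sigma):
    """Prior-to-posterior KL divergence used as the checking statistic."""
    mu_n, tau_n = posterior_params(y, mu0, tau0, sigma)
    return kl_normal(mu_n, tau_n, mu0, tau0)

def conflict_pvalue(y_obs, mu0, tau0, sigma, n_rep=5000, seed=None):
    """Tail probability of the observed statistic under the prior predictive."""
    rng = np.random.default_rng(seed)
    n = len(y_obs)
    d_obs = check_statistic(y_obs, mu0, tau0, sigma)
    d_rep = np.empty(n_rep)
    for r in range(n_rep):
        theta = rng.normal(mu0, tau0)             # parameter from the prior
        y_rep = rng.normal(theta, sigma, size=n)  # replicate data given theta
        d_rep[r] = check_statistic(y_rep, mu0, tau0, sigma)
    return np.mean(d_rep >= d_obs)                # small value signals conflict

# Illustration: data centred far from the prior mean should give a small p-value.
rng = np.random.default_rng(0)
y_obs = rng.normal(6.0, 1.0, size=20)
p = conflict_pvalue(y_obs, mu0=0.0, tau0=1.0, sigma=1.0, seed=1)
print(f"prior-data conflict p-value: {p:.4f}")
```

A small tail probability indicates that the observed prior-to-posterior divergence is unusually large relative to what the prior predictive would generate, which is the sense in which the prior and the data are judged to be in conflict.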