Approximate Bayesian computation (ABC) is a popular likelihood-free inference method for models with intractable likelihood functions. As ABC methods usually rely on comparing summary statistics of observed and simulated data, the choice of statistics is crucial. This choice involves a trade-off between information loss and dimensionality reduction, and is often made based on domain knowledge. However, handcrafting and selecting suitable statistics is a laborious task involving multiple trial-and-error steps. In this work, we introduce an active learning method for ABC statistics selection which considerably reduces the domain expert's work. By involving the experts, we are able to handle misspecified models, unlike existing dimension-reduction methods. Moreover, empirical results show better posterior estimates than existing methods when the simulation budget is limited.
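To make the role of summary statistics concrete, the following is a minimal sketch of plain ABC rejection sampling on a hypothetical toy problem (inferring the mean of a unit-variance Gaussian). All names and numbers here are illustrative assumptions, not part of the method proposed in this work; note how the whole procedure hinges on the `summary` function, which is exactly the choice the abstract discusses.

```python
import random
import statistics

def abc_rejection(observed, simulator, summary, prior_sample,
                  n_sims=5000, epsilon=0.1):
    """Basic ABC rejection sampler: draw parameters from the prior,
    simulate data, and keep parameters whose simulated summary
    statistic lies within epsilon of the observed one."""
    s_obs = summary(observed)
    accepted = []
    for _ in range(n_sims):
        theta = prior_sample()
        s_sim = summary(simulator(theta))
        if abs(s_sim - s_obs) < epsilon:
            accepted.append(theta)
    return accepted

# Toy example (hypothetical): infer the mean of a unit-variance Gaussian.
random.seed(0)
true_mu = 2.0
observed = [random.gauss(true_mu, 1.0) for _ in range(100)]

posterior = abc_rejection(
    observed,
    simulator=lambda mu: [random.gauss(mu, 1.0) for _ in range(100)],
    summary=statistics.mean,  # a sufficient statistic here; rarely available in practice
    prior_sample=lambda: random.uniform(-5, 5),
)
print(len(posterior), statistics.mean(posterior))
```

Here the sample mean happens to be sufficient, so little information is lost; for realistic simulators no such statistic is known, and a poorly chosen `summary` biases the accepted posterior sample, which is the trade-off motivating statistics-selection methods.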