This work proposes "jointly amortized neural approximation" (JANA) of intractable likelihood functions and posterior densities arising in Bayesian surrogate modeling and simulation-based inference. We train three complementary networks in an end-to-end fashion: 1) a summary network to compress individual data points, sets, or time series into informative embedding vectors; 2) a posterior network to learn an amortized approximate posterior; and 3) a likelihood network to learn an amortized approximate likelihood. Their interaction opens a new route to amortized marginal likelihood and posterior predictive estimation -- two important ingredients of Bayesian workflows that are often too expensive for standard methods. We benchmark the fidelity of JANA on a variety of simulation models against state-of-the-art Bayesian methods and propose a powerful and interpretable diagnostic for joint calibration. In addition, we investigate the ability of recurrent likelihood networks to emulate complex time series models without resorting to hand-crafted summary statistics.
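In symbols, the end-to-end training described above can be written as a single objective over simulated parameter-data pairs. The formulation below is a sketch consistent with the abstract, not a transcription of the paper; the notation \(h_{\eta}\) (summary network), \(q_{\psi}\) (posterior network), and \(\ell_{\phi}\) (likelihood network) is assumed here:

\[
\min_{\phi,\,\psi,\,\eta}\; \mathbb{E}_{(\boldsymbol{\theta},\,\boldsymbol{x}) \sim p(\boldsymbol{\theta},\,\boldsymbol{x})}\Big[-\log q_{\psi}\big(\boldsymbol{\theta} \mid h_{\eta}(\boldsymbol{x})\big) \;-\; \log \ell_{\phi}\big(\boldsymbol{x} \mid \boldsymbol{\theta}\big)\Big]
\]

The first term drives amortized posterior inference through the learned embeddings \(h_{\eta}(\boldsymbol{x})\); the second trains the likelihood network directly on the raw parameters, so both approximations are fit jointly on the same stream of simulations.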
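The "new route to amortized marginal likelihood and posterior predictive estimation" follows from two standard identities once both networks are trained: \(p(\boldsymbol{x}) = p(\boldsymbol{x} \mid \boldsymbol{\theta})\,p(\boldsymbol{\theta}) / p(\boldsymbol{\theta} \mid \boldsymbol{x})\) holds exactly for any \(\boldsymbol{\theta}\), and the posterior predictive is an average of likelihoods under posterior draws. Below is a minimal Python sketch built on these identities; the callables log_lik, log_prior, log_post, and sample_post are hypothetical stand-ins for the trained networks and the prior, not an actual API:

```python
import numpy as np
from scipy.special import logsumexp


def log_marginal_likelihood(x, thetas, log_lik, log_prior, log_post):
    """Estimate log p(x) from amortized likelihood and posterior networks.

    Uses log p(x) = log p(x|theta) + log p(theta) - log p(theta|x),
    exact for any single theta; averaging over several posterior draws
    is one simple way to reduce sensitivity to approximation error.
    """
    vals = [log_lik(x, t) + log_prior(t) - log_post(t, x) for t in thetas]
    return float(np.mean(vals))


def log_posterior_predictive(x_new, x_obs, sample_post, log_lik, n_draws=500):
    """Estimate log p(x_new | x_obs) via Monte Carlo over posterior draws.

    p(x_new | x_obs) ~= (1/S) * sum_s p(x_new | theta_s) with
    theta_s ~ q(theta | x_obs); computed in log-space with logsumexp
    for numerical stability.
    """
    thetas = sample_post(x_obs, n_draws)  # amortized posterior draws
    log_terms = np.array([log_lik(x_new, t) for t in thetas])
    return float(logsumexp(log_terms) - np.log(n_draws))
```

Because both densities are amortized, each estimate costs only forward passes through the networks, with no per-dataset MCMC or nested sampling.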