This work proposes "jointly amortized neural approximation" (JANA) of intractable likelihood functions and posterior densities arising in Bayesian surrogate modeling and simulation-based inference. We train three complementary networks in an end-to-end fashion: 1) a summary network to compress individual data points, sets, or time series into informative embedding vectors; 2) a posterior network to learn an amortized approximate posterior; and 3) a likelihood network to learn an amortized approximate likelihood. Their interaction opens a new route to amortized marginal likelihood and posterior predictive estimation -- two important ingredients of Bayesian workflows that are often too expensive for standard methods. We benchmark the fidelity of JANA on a variety of simulation models against state-of-the-art Bayesian methods and propose a powerful and interpretable diagnostic for joint calibration. In addition, we investigate the ability of recurrent likelihood networks to emulate complex time series models without resorting to hand-crafted summary statistics.
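To make the three-network setup and the resulting marginal-likelihood route concrete, the following is a deliberately minimal sketch, not the paper's implementation: a toy linear-Gaussian simulator is assumed, the summary network is replaced by a fixed mean embedding, and the posterior and likelihood networks are replaced by Gaussian heads fitted via least squares (the normalizing flows of the actual method are far more expressive). The helper `log_marginal` is hypothetical and only illustrates the identity log p(x) = log p(x | theta) + log p(theta) - log p(theta | x), with the amortized approximations standing in for the exact densities.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy simulator (assumption): theta ~ N(0, 1); each dataset holds
# N noisy observations x_i = theta + eps, eps ~ N(0, 0.5^2).
M, N = 5000, 10  # M simulated datasets, N observations each
theta = rng.normal(0.0, 1.0, size=M)
x = theta[:, None] + rng.normal(0.0, 0.5, size=(M, N))

# 1) "Summary network" stand-in: compress each dataset to an embedding
#    (here simply the sample mean).
s = x.mean(axis=1)

# 2) "Posterior network" stand-in: Gaussian head q(theta | s) = N(a*s + b, sq^2),
#    fitted by maximum likelihood, which for a Gaussian head is least squares.
A = np.column_stack([s, np.ones(M)])
(a, b), *_ = np.linalg.lstsq(A, theta, rcond=None)
sq = np.std(theta - (a * s + b))

# 3) "Likelihood network" stand-in: Gaussian head p(x_i | theta) = N(c*theta + d, sl^2),
#    fitted on individual observations.
th_rep, x_flat = np.repeat(theta, N), x.ravel()
B = np.column_stack([th_rep, np.ones(M * N)])
(c, d), *_ = np.linalg.lstsq(B, x_flat, rcond=None)
sl = np.std(x_flat - (c * th_rep + d))

def log_norm(z, mu, sig):
    """Elementwise Gaussian log-density."""
    return -0.5 * np.log(2 * np.pi * sig**2) - 0.5 * ((z - mu) / sig) ** 2

def log_marginal(x_obs, theta_star):
    """Hypothetical helper: amortized marginal-likelihood estimate via
    log p(x) = log p(x | th) + log p(th) - log p(th | x), evaluated at any
    theta_star, with the fitted heads replacing the exact densities."""
    s_obs = x_obs.mean()
    log_lik = log_norm(x_obs, c * theta_star + d, sl).sum()
    log_prior = log_norm(theta_star, 0.0, 1.0)
    log_post = log_norm(theta_star, a * s_obs + b, sq)
    return log_lik + log_prior - log_post

# The identity holds for any theta_star, so two evaluation points
# should yield roughly the same estimate for a well-fitted pair.
x_new = 0.7 + rng.normal(0.0, 0.5, size=N)  # dataset simulated at theta = 0.7
print(log_marginal(x_new, 0.0), log_marginal(x_new, 1.0))
```

The point of the sketch is that once both densities are amortized, marginal-likelihood estimates come essentially for free per dataset, which is exactly the workflow ingredient the abstract highlights.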