In the absence of explicit or tractable likelihoods, Bayesians often resort to approximate Bayesian computation (ABC) for inference. Our work bridges ABC with deep neural implicit samplers based on generative adversarial networks (GANs) and adversarial variational Bayes. Both ABC and GANs compare aspects of observed and fake data to simulate from posteriors and likelihoods, respectively. We develop a Bayesian GAN (B-GAN) sampler that directly targets the posterior by solving an adversarial optimization problem. B-GAN is driven by a deterministic mapping learned on the ABC reference table by conditional GANs. Once the mapping has been trained, iid posterior samples are obtained by filtering noise at a negligible additional cost. We propose two post-processing local refinements using (1) data-driven proposals with importance reweighting, and (2) variational Bayes. We support our findings with frequentist-Bayesian theory, showing that the typical total variation distance between the true and approximate posteriors converges to zero for certain neural network generators and discriminators. Experiments on simulated data show highly competitive performance relative to some of the most recent likelihood-free posterior simulators.
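The core mechanism above is pushing cheap noise through a learned deterministic map conditioned on the observed data. As a minimal sketch of that "noise-filtering" step, the example below uses a conjugate Gaussian model where the conditional map is available in closed form (unlike B-GAN, where it would be learned on the ABC reference table by a conditional GAN); the model, sample sizes, and tolerances are illustrative assumptions, not the paper's setup.

```python
import numpy as np

# Toy conjugate setting: x_i ~ N(theta, 1), prior theta ~ N(0, 1).
# The exact posterior is N(mu_n, s_n^2) with
#   s_n^2 = 1 / (1 + n),  mu_n = s_n^2 * sum(x_obs).
# B-GAN would learn a deterministic map T(z, x) adversarially; here
# we use the known closed-form map for this model, purely to show
# how iid posterior draws come from "filtering noise" at low cost.

rng = np.random.default_rng(0)
theta_true = 1.5
n = 50
x_obs = rng.normal(theta_true, 1.0, size=n)

s2 = 1.0 / (1.0 + n)        # exact posterior variance
mu = s2 * x_obs.sum()       # exact posterior mean

def posterior_map(z, mu, s2):
    """Deterministic map pushing N(0, 1) noise to the posterior."""
    return mu + np.sqrt(s2) * z

z = rng.standard_normal(10_000)       # cheap iid noise
samples = posterior_map(z, mu, s2)    # iid posterior draws

# Empirical moments should match the exact posterior closely.
print(samples.mean(), samples.std())
```

In B-GAN the analytic `posterior_map` is replaced by a trained generator network, but the sampling step is identical: once training is done, each additional posterior draw costs only one pass of noise through the map.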