We propose a posterior for Bayesian Likelihood-Free Inference (LFI) based on generalized Bayesian inference. To define the posterior, we use Scoring Rules (SRs), which evaluate probabilistic models given an observation. In LFI, we can sample from the model but not evaluate the likelihood; hence, we employ SRs that admit unbiased empirical estimates. We use the Energy and Kernel SRs, for which our posterior enjoys consistency in a well-specified setting and robustness to outliers. We perform inference with pseudo-marginal (PM) Markov Chain Monte Carlo (MCMC) or stochastic-gradient (SG) MCMC. While PM-MCMC works satisfactorily for simple setups, it mixes poorly for concentrated targets. Conversely, SG-MCMC requires differentiating the simulator model, but it outperforms PM-MCMC when both are applicable and, being rejection-free, scales to higher-dimensional setups. Although both techniques target the SR posterior only approximately, the error diminishes as the number of model simulations at each MCMC step increases. In our simulations, we rely on automatic differentiation to obtain simulator gradients with no manual effort. We compare our posterior with related approaches on standard benchmarks and on a chaotic dynamical system from meteorology, for which SG-MCMC allows inferring the parameters of a neural network used to parametrize part of the update equations of the dynamical system.
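As a minimal sketch of the construction (the symbols $w$, $\beta$, and $m$ below are notation we introduce for illustration, not fixed by the abstract), the generalized posterior replaces the likelihood with an exponentiated scoring rule,
$$\pi_{\mathrm{SR}}(\theta \mid y) \;\propto\; \pi(\theta)\,\exp\{-w\, S(P_\theta, y)\},$$
where $\pi(\theta)$ is the prior, $P_\theta$ the simulator model, and $w > 0$ a learning rate. One version of the Energy Score, with $X, X' \sim P_\theta$ independent and $\beta \in (0, 2)$, is
$$S_{\mathrm{E}}(P_\theta, y) \;=\; 2\,\mathbb{E}\,\|X - y\|^{\beta} \;-\; \mathbb{E}\,\|X - X'\|^{\beta},$$
which admits the unbiased empirical estimate from $m$ simulations $x_1, \ldots, x_m \sim P_\theta$:
$$\hat{S}_{\mathrm{E}}(x_{1:m}, y) \;=\; \frac{2}{m} \sum_{j=1}^{m} \|x_j - y\|^{\beta} \;-\; \frac{1}{m(m-1)} \sum_{j \neq k} \|x_j - x_k\|^{\beta}.$$
Note that even though $\hat{S}_{\mathrm{E}}$ is unbiased for $S_{\mathrm{E}}$, $\exp\{-w\,\hat{S}_{\mathrm{E}}\}$ is not an unbiased estimate of $\exp\{-w\, S_{\mathrm{E}}\}$; this is why both MCMC schemes target the SR posterior only approximately, with an error that diminishes as $m$ grows.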