In Bayesian inference, prior hyperparameters are chosen subjectively or estimated using empirical Bayes methods. Generalised Bayesian Inference also has hyperparameters (the learning rate and the parameters of the loss). As part of the Generalised-Bayes workflow, it is necessary to check sensitivity to the choice of hyperparameters, but running MCMC or fitting a variational approximation at each hyperparameter setting is impractical when there are more than a few hyperparameters. Simulation-Based Inference has been used to amortise over data and hyperparameters and can be useful for Bayesian problems. However, there is no Simulation-Based Inference for Generalised Bayes posteriors, as there is no generative model for the data. Working with a variational family parameterised by a normalising flow, we show how to fit a variational Generalised Bayes posterior amortised over all hyperparameters. The fitted posterior can be sampled very efficiently at different hyperparameter values without refitting, supporting efficient robustness checks and hyperparameter selection. We show that there exist amortised normalising-flow architectures which are universal approximators. We test our approach on a relatively large-scale application of Generalised Bayesian Inference. The code is available online.
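To make the construction concrete, below is a minimal sketch (not the authors' code) of the idea in PyTorch: a conditional normalising flow whose coupling layers take the hyperparameters as extra inputs is trained by minimising the reverse KL divergence to the Gibbs posterior, with hyperparameters resampled at every optimisation step so that a single fit is amortised over the whole hyperparameter range. The loss, prior, hyperparameter range, and all names (`ConditionalCoupling`, `AmortisedFlow`, `loss_fn`) are illustrative assumptions, not details from the paper.

```python
import torch
import torch.nn as nn

class ConditionalCoupling(nn.Module):
    """Affine coupling layer whose scale and shift also depend on the
    hyperparameters eta, so one flow is shared across all eta."""
    def __init__(self, dim, eta_dim, hidden=64, flip=False):
        super().__init__()
        self.flip = flip
        half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(half + eta_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, 2 * (dim - half)),
        )

    def forward(self, z, eta):
        z1, z2 = z.chunk(2, dim=-1)
        if self.flip:
            z1, z2 = z2, z1
        s, t = self.net(torch.cat([z1, eta], dim=-1)).chunk(2, dim=-1)
        s = torch.tanh(s)                      # bounded log-scales for stability
        x2 = z2 * torch.exp(s) + t
        out = torch.cat([x2, z1] if self.flip else [z1, x2], dim=-1)
        return out, s.sum(dim=-1)              # transformed point, log|det J|

class AmortisedFlow(nn.Module):
    def __init__(self, dim, eta_dim, n_layers=4):
        super().__init__()
        self.dim = dim
        self.layers = nn.ModuleList(
            ConditionalCoupling(dim, eta_dim, flip=(i % 2 == 1))
            for i in range(n_layers))
        self.base = torch.distributions.Normal(0.0, 1.0)

    def sample_and_log_prob(self, n, eta):
        z = self.base.sample((n, self.dim))
        log_q = self.base.log_prob(z).sum(dim=-1)
        for layer in self.layers:
            z, log_det = layer(z, eta)
            log_q = log_q - log_det            # change-of-variables correction
        return z, log_q

# Toy Gibbs posterior pi_GB(theta | w) ∝ prior(theta) exp(-w * loss(theta)),
# with the learning rate w as the single hyperparameter (all placeholders).
y = torch.randn(20) + 1.0                      # fixed synthetic data
def loss_fn(theta):                            # squared-error loss in theta
    return ((y[None, :, None] - theta[:, None, :]) ** 2).mean(dim=(1, 2))
def log_prior(theta):
    return torch.distributions.Normal(0.0, 5.0).log_prob(theta).sum(dim=-1)

flow = AmortisedFlow(dim=2, eta_dim=1)
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)
for step in range(2000):
    w = 0.1 + 4.9 * torch.rand(128, 1)         # resample learning rates in [0.1, 5]
    theta, log_q = flow.sample_and_log_prob(128, w)
    # Reverse KL(q || pi_GB) up to a constant, averaged over hyperparameter draws
    kl = (log_q - log_prior(theta) + w.squeeze(-1) * loss_fn(theta)).mean()
    opt.zero_grad(); kl.backward(); opt.step()

# After training, sample at any hyperparameter value without refitting:
theta_w, _ = flow.sample_and_log_prob(1000, torch.full((1000, 1), 2.5))
```

Because the flow is conditioned on the hyperparameters and they are resampled at every step, a single optimisation amortises over the whole hyperparameter range; sampling at a new setting is a single forward pass, which is what makes cheap sensitivity checks and hyperparameter selection possible in this sketch.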