Bayesian Generative AI (BayesGen-AI) methods are developed and applied to Bayesian computation. BayesGen-AI reconstructs the posterior distribution by directly modeling the parameter of interest as a mapping (a.k.a. deep learner) trained on a large simulated dataset. This yields a generator that can be evaluated at the observed data to provide draws from the posterior distribution. The method applies to all forms of Bayesian inference, including parametric models, likelihood-free models, prediction, and maximum expected utility problems. Bayesian computation is then equivalent to high-dimensional nonparametric regression. BayesGen-AI's main advantage is that it is density-free and therefore provides an alternative to Markov chain Monte Carlo. It has a number of advantages over vanilla generative adversarial networks (GANs) and approximate Bayesian computation (ABC) methods: the generator is simpler to learn than a GAN architecture and is more flexible than the kernel smoothing implicit in ABC methods. Design of the network architecture requires careful selection of features (a.k.a. dimension reduction) and a nonlinear architecture for inference. As a generic architecture, we propose a deep quantile neural network together with a uniform base distribution at which to evaluate the generator. To illustrate our methodology, we provide two real data examples, the first in traffic flow prediction and the second in building a surrogate for a satellite drag dataset. Finally, we conclude with directions for future research.
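As a rough illustration of the recipe sketched above (simulate parameter/data pairs, train a deep quantile neural network on a summary feature and a uniform quantile input, then evaluate the trained generator at the observed data to obtain posterior draws), the following Python/PyTorch sketch shows one minimal instantiation. The Gaussian location model, the sample-mean summary statistic, and all network and training settings are illustrative assumptions, not the paper's actual examples or architecture.

```python
# Minimal sketch of the BayesGen-AI idea (assumptions flagged in comments):
# simulate (theta, y) pairs, train a quantile neural network mapping
# (summary of y, uniform tau) -> theta, then evaluate at the observed data
# with fresh uniform draws to approximate posterior samples.
import torch
import torch.nn as nn

torch.manual_seed(0)

# --- 1. Simulate training data from prior and model (hypothetical model) ----
N = 20_000
theta = torch.randn(N, 1)                      # prior: theta ~ N(0, 1)
y = theta + 0.5 * torch.randn(N, 10)           # model: y_j ~ N(theta, 0.5^2), 10 obs
s = y.mean(dim=1, keepdim=True)                # feature / summary statistic (dimension reduction)

# --- 2. Deep quantile neural network G(s, tau) approximating theta ----------
net = nn.Sequential(nn.Linear(2, 64), nn.ReLU(),
                    nn.Linear(64, 64), nn.ReLU(),
                    nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def pinball_loss(pred, target, tau):
    """Quantile (pinball) loss averaged over the batch."""
    err = target - pred
    return torch.mean(torch.maximum(tau * err, (tau - 1.0) * err))

for epoch in range(200):
    tau = torch.rand(N, 1)                     # uniform base distribution
    pred = net(torch.cat([s, tau], dim=1))
    loss = pinball_loss(pred, theta, tau)
    opt.zero_grad()
    loss.backward()
    opt.step()

# --- 3. Posterior draws: evaluate the generator at the observed data --------
y_obs = torch.full((1, 10), 1.2)               # hypothetical observed dataset
s_obs = y_obs.mean(dim=1, keepdim=True).repeat(5_000, 1)
tau_new = torch.rand(5_000, 1)                 # fresh uniform draws
with torch.no_grad():
    posterior_draws = net(torch.cat([s_obs, tau_new], dim=1))
print(posterior_draws.mean().item(), posterior_draws.std().item())
```

In this toy setting the conjugate posterior is available in closed form, so the sampled mean and spread can be checked against it; in the likelihood-free settings the abstract targets, only the simulator is needed.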