In applied fields where inference speed and model flexibility are crucial, Bayesian inference for models with a stochastic process prior, e.g. a Gaussian process (GP), is ubiquitous. Recent literature has demonstrated that the computational bottleneck caused by GP priors can be alleviated by encoding the priors, or their finite realisations, with deep generative models such as variational autoencoders (VAEs); the learned generators can then be used as drop-in replacements for the original priors during Markov chain Monte Carlo (MCMC) inference. While this approach enables fast and highly efficient inference, it loses information about the stochastic process hyperparameters and, as a consequence, makes inference over hyperparameters impossible and the learned priors indistinct. We propose to resolve this issue and disentangle the learned priors by conditioning the VAE on the stochastic process hyperparameters. This way, the hyperparameters are encoded alongside the GP realisations and can be explicitly estimated at the inference stage. We believe that the new method, termed PriorCVAE, will be a useful tool among approximate inference approaches and has the potential to have a large impact on spatial and spatiotemporal inference in crucial real-life applications. Code showcasing the PriorCVAE technique can be accessed via the following link: https://github.com/elizavetasemenova/PriorCVAE
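To make the conditioning idea concrete, the sketch below illustrates in JAX how a conditional VAE of this kind could be set up: GP draws on a fixed grid are paired with the lengthscales that generated them, and both the encoder and the decoder receive the lengthscale as an extra input, so that the trained decoder maps (latent vector, lengthscale) to a GP-like realisation. This is a minimal illustrative sketch, not the implementation from the linked repository; the RBF kernel, the lengthscale range, the network sizes, the Gaussian decoder likelihood, and the plain full-batch gradient-descent training loop are all assumptions made for the example.

```python
import jax
import jax.numpy as jnp
from jax import random

# --- 1. Training data: GP draws on a fixed grid, each paired with its lengthscale ---
x = jnp.linspace(0.0, 1.0, 50)  # evaluation grid

def rbf_kernel(x, lengthscale, variance=1.0, jitter=1e-4):
    d = x[:, None] - x[None, :]
    return variance * jnp.exp(-0.5 * (d / lengthscale) ** 2) + jitter * jnp.eye(x.shape[0])

def sample_gp(key, lengthscale):
    L = jnp.linalg.cholesky(rbf_kernel(x, lengthscale))
    return L @ random.normal(key, (x.shape[0],))

keys = random.split(random.PRNGKey(0), 2001)
lengthscales = random.uniform(keys[0], (2000,), minval=0.05, maxval=0.5)
gp_draws = jax.vmap(sample_gp)(keys[1:], lengthscales)  # shape (2000, 50)

# --- 2. Conditional VAE: encoder and decoder both receive the lengthscale ---
latent_dim, hidden = 10, 64

def init_mlp(key, sizes):
    params = []
    for k, (m, n) in zip(random.split(key, len(sizes) - 1), zip(sizes[:-1], sizes[1:])):
        params.append((random.normal(k, (m, n)) / jnp.sqrt(m), jnp.zeros(n)))
    return params

def mlp(params, h):
    for W, b in params[:-1]:
        h = jnp.tanh(h @ W + b)
    W, b = params[-1]
    return h @ W + b

k_enc, k_dec = random.split(random.PRNGKey(1))
enc_params = init_mlp(k_enc, [50 + 1, hidden, 2 * latent_dim])  # outputs mean and log-variance of z
dec_params = init_mlp(k_dec, [latent_dim + 1, hidden, 50])      # maps (z, lengthscale) to a realisation

def negative_elbo(params, key, y, ls):
    enc_p, dec_p = params
    stats = mlp(enc_p, jnp.concatenate([y, ls[None]]))
    mu, logvar = stats[:latent_dim], stats[latent_dim:]
    z = mu + jnp.exp(0.5 * logvar) * random.normal(key, (latent_dim,))  # reparameterisation
    y_hat = mlp(dec_p, jnp.concatenate([z, ls[None]]))
    recon = 0.5 * jnp.sum((y - y_hat) ** 2)  # Gaussian decoder likelihood with unit noise
    kl = -0.5 * jnp.sum(1.0 + logvar - mu ** 2 - jnp.exp(logvar))
    return recon + kl

def loss_fn(params, key, ys, lss):
    keys = random.split(key, ys.shape[0])
    return jnp.mean(jax.vmap(negative_elbo, in_axes=(None, 0, 0, 0))(params, keys, ys, lss))

grad_fn = jax.jit(jax.grad(loss_fn))

params, lr, train_key = (enc_params, dec_params), 1e-3, random.PRNGKey(2)
for step in range(2000):  # plain full-batch gradient descent, for brevity
    train_key, sub = random.split(train_key)
    grads = grad_fn(params, sub, gp_draws, lengthscales)
    params = jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

# After training, the decoder acts as a drop-in surrogate for the GP prior inside MCMC:
# z gets a standard normal prior, while the lengthscale is an explicit latent variable
# that can be given its own prior and inferred alongside z.
decoder = lambda z, ls: mlp(params[1], jnp.concatenate([z, jnp.atleast_1d(ls)]))
```

The key design point the sketch tries to reflect is that the lengthscale is fed to the decoder rather than absorbed into it, so at inference time it remains a free quantity with its own prior; this is what recovers the hyperparameter inference that an unconditional learned prior gives up.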