Gaussian processes (GPs), realised as multivariate Gaussian distributions over any finite collection of data, are the most popular approach in small-area spatial statistical modelling. In this context they are used to encode correlation structures over space and can generalise well in interpolation tasks. Despite their flexibility, off-the-shelf GPs present serious computational challenges which limit their scalability and practical usefulness in applied settings. Here, we propose a novel, deep generative modelling approach to tackle this challenge, termed PriorVAE: for a particular spatial setting, we approximate a class of GP priors through prior sampling and subsequent fitting of a variational autoencoder (VAE). Given a trained VAE, the resulting decoder makes spatial inference highly efficient, because inference operates on the VAE's low-dimensional latent space of independent Gaussian variables. Once trained, the VAE decoder replaces the GP prior within a Bayesian sampling framework. This approach provides a tractable and easy-to-implement means of approximately encoding spatial priors and facilitates efficient statistical inference. We demonstrate the utility of our two-stage VAE approach on Bayesian small-area estimation tasks.
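To make the two-stage idea concrete, the following is a minimal sketch, assuming a fixed one-dimensional spatial grid, a squared-exponential (RBF) GP kernel, and a small PyTorch VAE. The function names, latent dimension, and network sizes are illustrative choices for exposition, not necessarily the architecture or implementation used in the paper.

```python
# Minimal PriorVAE-style sketch: (1) draw GP prior realisations on fixed locations,
# (2) fit a VAE to those draws, (3) use the decoder in place of the GP at inference time.
import numpy as np
import torch
import torch.nn as nn

# ---- Stage 1a: draw training samples from the GP prior on a fixed set of locations ----
def rbf_kernel(x, lengthscale=0.2, variance=1.0, jitter=1e-6):
    # Squared-exponential covariance with a small jitter for numerical stability.
    d2 = (x[:, None] - x[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2) + jitter * np.eye(len(x))

grid = np.linspace(0.0, 1.0, 100)                        # spatial locations, fixed in advance
L = np.linalg.cholesky(rbf_kernel(grid))
prior_draws = (L @ np.random.randn(len(grid), 5000)).T   # 5000 GP prior realisations

# ---- Stage 1b: fit a VAE to the prior draws ----
latent_dim = 10

class VAE(nn.Module):
    def __init__(self, n, d):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(n, 64), nn.ReLU())
        self.mu = nn.Linear(64, d)
        self.logvar = nn.Linear(64, d)
        self.dec = nn.Sequential(nn.Linear(d, 64), nn.ReLU(), nn.Linear(64, n))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterisation trick
        return self.dec(z), mu, logvar

vae = VAE(len(grid), latent_dim)
opt = torch.optim.Adam(vae.parameters(), lr=1e-3)
data = torch.tensor(prior_draws, dtype=torch.float32)

for epoch in range(50):
    recon, mu, logvar = vae(data)
    recon_loss = ((recon - data) ** 2).sum(dim=1).mean()
    kl = -0.5 * (1 + logvar - mu**2 - logvar.exp()).sum(dim=1).mean()
    loss = recon_loss + kl
    opt.zero_grad()
    loss.backward()
    opt.step()

# ---- Stage 2 (sketched): inference with the trained decoder ----
# Inside a Bayesian sampler, the GP prior f ~ GP(0, K) evaluated on `grid` is replaced by
#   z ~ N(0, I_latent_dim),  f = vae.dec(z),
# so MCMC explores the low-dimensional, independently distributed latent z instead of f.
```

The key computational point is that the sampler in stage 2 only ever sees the independent latent variables z, avoiding the cubic-cost covariance operations of a full GP at inference time.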