Deep Gaussian processes (DGPs) are popular surrogate models for complex nonstationary computer experiments. DGPs use one or more latent Gaussian processes (GPs) to warp the input space into a plausibly stationary regime, then use typical GP regression on the warped domain. While this composition of GPs is conceptually straightforward, the functional nature of the multi-dimensional latent warping makes Bayesian posterior inference challenging. Traditional GPs with smooth kernels are naturally suited for the integration of gradient information, but the integration of gradients within a DGP presents new challenges and has yet to be explored. We propose a novel and comprehensive Bayesian framework for DGPs with gradients that facilitates both gradient-enhancement and gradient posterior predictive distributions. We provide open-source software in the "deepgp" package on CRAN, with optional Vecchia approximation to circumvent cubic computational bottlenecks. We benchmark our DGPs with gradients on a variety of nonstationary simulations, showing improvement over both GPs with gradients and conventional DGPs.