In this work, we use Deep Gaussian Processes (DGPs) as statistical surrogates for stochastic processes with complex distributions. Conventional inferential methods for DGP models can suffer from high computational complexity, as they require large-scale operations with kernel matrices for training and inference. In this work, we propose a scheme for accurate inference and efficient training based on a class of Gaussian processes called Tensor Markov Gaussian Processes (TMGPs). We construct an induced approximation of a TMGP referred to as its hierarchical expansion. Next, we develop a deep TMGP (DTMGP) model as the composition of multiple hierarchical expansions of TMGPs. The proposed DTMGP model has the following properties: (1) the output of each activation function is deterministic, while the weights are drawn independently from the standard Gaussian distribution; (2) in training or prediction, only polylog(M) (out of M) activation functions have non-zero outputs, which significantly boosts computational efficiency. Our numerical experiments on synthetic models and real datasets show the superior computational efficiency of DTMGP over existing DGP models.
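The sparse-activation property in (2) can be illustrated with a toy one-dimensional hierarchical expansion built from hat functions on disjoint dyadic intervals: of the M = 2^L − 1 basis functions, at most one per level is non-zero at any input, so only L = log2(M + 1) activations fire. This is a minimal sketch under assumed basis and function names, not the paper's exact TMGP construction.

```python
import numpy as np

def hat(t):
    # Standard hat function on [0, 1], peaking at t = 0.5 (illustrative choice).
    return np.maximum(1.0 - np.abs(2.0 * t - 1.0), 0.0)

def hierarchical_features(x, L):
    """Evaluate M = 2^L - 1 hierarchical hat basis functions at scalar x in [0, 1).

    The level-l functions have disjoint supports of width 2^-(l-1), so at most
    one function per level -- hence at most L = log2(M + 1) of the M outputs --
    is non-zero for any input.
    """
    feats = np.zeros(2**L - 1)
    idx = 0
    for level in range(1, L + 1):
        n = 2 ** (level - 1)               # number of basis functions at this level
        j = int(np.floor(x * n))           # the single interval containing x
        a = j / n                          # left endpoint of that interval
        feats[idx + j] = hat((x - a) * n)  # rescale x to [0, 1) on the support
        idx += n
    return feats

rng = np.random.default_rng(0)
L = 6
M = 2**L - 1
w = rng.standard_normal(M)                 # weights drawn i.i.d. from N(0, 1)
phi = hierarchical_features(0.37, L)
print(np.count_nonzero(phi), "of", M, "features active")  # at most L = 6
print(float(phi @ w))                      # one sample-path evaluation
```

Because the weights are Gaussian and the feature vector is deterministic and sparse, each forward evaluation touches only O(log M) terms, which is the source of the claimed efficiency gain over dense kernel-matrix operations.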