We provide guarantees for approximate Gaussian Process (GP) regression resulting from two common low-rank kernel approximations: one based on random Fourier features and one based on truncating the kernel's Mercer expansion. In particular, we bound the Kullback-Leibler divergence between an exact GP and one resulting from either of these low-rank approximations to its kernel, as well as between their corresponding predictive densities; we also bound the error between predictive mean vectors and between predictive covariance matrices computed using the exact versus the approximate GP. We provide experiments on both simulated data and standard benchmarks to evaluate the tightness of our theoretical bounds.
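As an illustration of the first approximation the abstract refers to, the following is a minimal sketch of the standard random Fourier feature (RFF) construction for the RBF kernel, in the style of Rahimi and Recht. It is not the paper's method; the function name, parameter values, and feature dimension are illustrative choices.

```python
import numpy as np

def rff_features(X, D, lengthscale, rng):
    """Random Fourier feature map z(x) = sqrt(2/D) * cos(W^T x + b),
    with W ~ N(0, 1/lengthscale^2) and b ~ Uniform[0, 2*pi], so that
    z(x)^T z(y) approximates the RBF kernel exp(-||x-y||^2 / (2*ls^2))."""
    n, d = X.shape
    W = rng.normal(0.0, 1.0 / lengthscale, size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))

# Rank-D approximation of the kernel matrix via the feature map.
Z = rff_features(X, D=2000, lengthscale=1.0, rng=rng)
K_approx = Z @ Z.T

# Exact RBF kernel matrix for comparison.
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K_exact = np.exp(-sq_dists / 2.0)

# The entrywise error shrinks as D grows (at rate O(1/sqrt(D))).
err = np.abs(K_approx - K_exact).max()
```

An approximate GP built this way replaces the exact kernel matrix with the low-rank product `Z @ Z.T`; the paper's bounds quantify the downstream effect of this substitution on the GP posterior.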