In this work we introduce a reduced-rank algorithm for Gaussian process regression. Our numerical scheme converts a Gaussian process on a user-specified interval to its Karhunen-Lo\`eve expansion, the $L^2$-optimal reduced-rank representation. Numerical evaluation of the Karhunen-Lo\`eve expansion is performed once during precomputation and involves computing a numerical eigendecomposition of an integral operator whose kernel is the covariance function of the Gaussian process. The Karhunen-Lo\`eve expansion is independent of the observed data and depends only on the covariance kernel and the size of the interval on which the Gaussian process is defined. The scheme of this paper does not require translation invariance of the covariance kernel. We also introduce a class of fast algorithms for Bayesian fitting of hyperparameters, and demonstrate the performance of our algorithms with numerical experiments in one and two dimensions. Extensions to higher dimensions are mathematically straightforward but suffer from the usual curse of dimensionality.
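The precomputation step described above can be illustrated with a minimal sketch. The snippet below discretizes the integral operator with a Gauss-Legendre rule and computes its eigendecomposition to obtain approximate Karhunen-Lo\`eve eigenvalues and eigenfunctions; the function name `kl_expansion` and the specific quadrature choice are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def kl_expansion(kernel, a, b, n):
    """Approximate Karhunen-Loeve eigenpairs of a covariance kernel on [a, b]
    by discretizing the integral operator (illustrative sketch; the paper's
    actual quadrature and eigensolver may differ)."""
    # Gauss-Legendre nodes and weights mapped from [-1, 1] to [a, b]
    x, w = np.polynomial.legendre.leggauss(n)
    x = 0.5 * (b - a) * x + 0.5 * (b + a)
    w = 0.5 * (b - a) * w
    # Symmetrized discretization: A = sqrt(W) K sqrt(W), which shares
    # eigenvalues with the weighted operator and stays symmetric
    K = kernel(x[:, None], x[None, :])
    sw = np.sqrt(w)
    A = sw[:, None] * K * sw[None, :]
    lam, V = np.linalg.eigh(A)
    # Sort eigenvalues in descending order; undo the weight scaling to
    # recover eigenfunction values at the quadrature nodes
    idx = np.argsort(lam)[::-1]
    lam = lam[idx]
    phi = V[:, idx] / sw[:, None]
    return lam, phi, x, w

# Example: a squared-exponential (non-stationary kernels work identically)
kern = lambda s, t: np.exp(-(s - t) ** 2 / (2 * 0.3 ** 2))
lam, phi, x, w = kl_expansion(kern, -1.0, 1.0, 100)
```

Note that this precomputation involves no observed data, matching the abstract: the eigenpairs depend only on the kernel and the interval, and the truncated expansion $\sum_j \sqrt{\lambda_j}\,\phi_j(t)\,\xi_j$ then gives the reduced-rank representation used for regression.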