We introduce a new interpretation of sparse variational approximations for Gaussian processes using inducing points, which can lead to more scalable algorithms than previous methods. It is based on decomposing a Gaussian process into a sum of two independent processes: one spanned by a finite basis of inducing points and the other capturing the remaining variation. We show that this formulation recovers existing approximations and at the same time allows us to obtain tighter lower bounds on the marginal likelihood and new stochastic variational inference algorithms. We demonstrate the efficiency of these algorithms in several Gaussian process models, ranging from standard regression to multi-class classification using (deep) convolutional Gaussian processes, and report state-of-the-art results on CIFAR-10 among purely GP-based models.
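As a concrete sketch of the decomposition the abstract refers to (the notation $Z$, $u$, $K_{ZZ}$ is ours; the abstract itself does not fix symbols): writing $Z$ for the inducing inputs, $u = f(Z)$ for the inducing variables, and $K_{ZZ} = k(Z, Z)$, the standard conditioning property of Gaussian processes gives

$$ f(\cdot) = f_{\parallel}(\cdot) + f_{\perp}(\cdot), \qquad f_{\parallel}(\cdot) = k(\cdot, Z)\, K_{ZZ}^{-1}\, u, $$
$$ f_{\perp} \sim \mathcal{GP}\!\left(0,\; k(x, x') - k(x, Z)\, K_{ZZ}^{-1}\, k(Z, x')\right), $$

where $f_{\parallel}$ and $f_{\perp}$ are independent: $f_{\parallel}$ lies in the finite-dimensional span of the inducing-point basis functions $k(\cdot, Z)$, while $f_{\perp}$ captures the remaining variation not explained by the inducing points.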