Gaussian processes (GPs) are a flexible Bayesian nonparametric modelling approach that is widely applied but computationally demanding: exact inference scales cubically with the number of data points. To address this poor scaling, approximation methods based on sparse Gaussian processes (SGPs) are attractive. An issue faced by SGPs, especially in latent variable models, is inefficient learning of the inducing inputs, which leads to poor model predictions. We propose a regularization approach that balances the reconstruction performance on the data against the approximation performance of the model itself. This regularization improves both inference and prediction. We extend the approach to latent variable models with SGPs and show that performing variational inference (VI) on those models is equivalent to performing VI on a related empirical Bayes model.
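As background for the sparse approximation the abstract refers to, the following is a minimal sketch of the standard collapsed variational lower bound for sparse GP regression (the Titsias-style bound), not the regularized objective proposed in this work. All names (`rbf_kernel`, `sparse_gp_elbo`) and hyperparameter values are illustrative assumptions; the bound is log N(y | 0, Q_nn + sigma^2 I) minus a trace penalty, where Q_nn = K_nm K_mm^{-1} K_mn is the Nystrom approximation built from the inducing inputs Z.

```python
import numpy as np
from scipy.stats import multivariate_normal

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between two sets of inputs."""
    sq_dists = (np.sum(X1**2, axis=1)[:, None]
                + np.sum(X2**2, axis=1)[None, :]
                - 2.0 * X1 @ X2.T)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def sparse_gp_elbo(X, y, Z, noise=0.1, lengthscale=1.0, variance=1.0):
    """Collapsed variational lower bound on the GP log marginal likelihood.

    Z are the inducing inputs; the bound tightens as Q_nn approaches K_nn,
    and the trace term penalizes the approximation gap.
    """
    n, m = len(X), len(Z)
    Kmm = rbf_kernel(Z, Z, lengthscale, variance) + 1e-6 * np.eye(m)  # jitter
    Knm = rbf_kernel(X, Z, lengthscale, variance)
    Qnn = Knm @ np.linalg.solve(Kmm, Knm.T)          # Nystrom approximation
    knn_diag = variance * np.ones(n)                 # diag of exact K_nn

    # Gaussian log-likelihood under the approximate covariance.
    log_lik = multivariate_normal.logpdf(
        y, mean=np.zeros(n), cov=Qnn + noise**2 * np.eye(n))
    # Trace regularizer: penalizes mass the inducing points fail to explain.
    trace_term = np.sum(knn_diag - np.diag(Qnn)) / (2.0 * noise**2)
    return log_lik - trace_term
```

With the inducing inputs set equal to the training inputs, Q_nn recovers K_nn and the trace term vanishes, so the bound matches the exact log marginal likelihood; with fewer inducing points the bound is strictly looser, which is the gap the inducing-input optimization (and the regularization discussed above) tries to close.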