Gaussian Processes (GPs) can be used as flexible, non-parametric function priors. Inspired by the growing body of work on Normalizing Flows, we enlarge this class of priors through a parametric invertible transformation that can be made input-dependent. Doing so also allows us to encode interpretable prior knowledge (e.g., boundedness constraints). We derive a variational approximation to the resulting Bayesian inference problem, which is as fast as stochastic variational GP regression (Hensman et al., 2013; Dezfouli and Bonilla, 2015). This makes the model a computationally efficient alternative to other hierarchical extensions of GP priors (Lazaro-Gredilla, 2012; Damianou and Lawrence, 2013). We demonstrate the algorithm's computational efficiency and inferential performance on a range of data sets. For example, even with only 5 inducing points and an input-dependent flow, our method is consistently competitive with a standard sparse GP fitted using 100 inducing points.
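To make the core construction concrete, the following is a minimal sketch, not the paper's implementation: samples from a GP prior are pushed through a parametric invertible map. The sigmoid flow used here is one hypothetical choice of transformation that encodes a boundedness constraint; the kernel, the flow, and all parameter names are illustrative assumptions.

import numpy as np

def rbf_kernel(x, lengthscale=1.0, variance=1.0):
    # Squared-exponential covariance matrix for 1-D inputs x.
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def bounded_flow(f, lo=0.0, hi=1.0):
    # Invertible map from R to (lo, hi); its inverse is the logit,
    # so densities under the transformed prior remain tractable via
    # the change-of-variables formula. In an input-dependent variant,
    # the flow parameters (here lo, hi) would be functions of x.
    return lo + (hi - lo) / (1.0 + np.exp(-f))

rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 200)
K = rbf_kernel(x) + 1e-6 * np.eye(len(x))                 # jitter for numerical stability
f = rng.multivariate_normal(np.zeros(len(x)), K, size=3)  # draws from the GP prior
g = bounded_flow(f)                                       # draws from the transformed, bounded prior

Every draw in g lies in (0, 1) by construction, which is the sense in which the transformation encodes interpretable prior knowledge such as boundedness.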