Gaussian Processes (GPs) are a versatile and popular method in Bayesian machine learning. A common modification is the Sparse Variational Gaussian Process (SVGP), which is well suited to large datasets. While GPs handle Gaussian-distributed target variables elegantly in closed form, their applicability can be extended to non-Gaussian data as well. These extensions are usually intractable in closed form and hence require approximate solutions. This paper proposes to approximate the inverse-link function, which is necessary when working with non-Gaussian likelihoods, by a piece-wise constant function. It is shown that this yields a closed-form solution for the corresponding SVGP lower bound. In addition, it is demonstrated how the piece-wise constant function itself can be optimized, resulting in an inverse-link function that can be learnt from the data at hand.
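To illustrate the core idea, the following is a minimal sketch (not the paper's actual construction): if the inverse-link is replaced by a piece-wise constant function on a grid, its expectation under a Gaussian reduces to a closed-form sum of bin values weighted by Gaussian CDF differences. The sigmoid inverse-link, grid choice, and function names here are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm


def sigmoid(x):
    # Example inverse-link for a Bernoulli likelihood (an assumption,
    # chosen only to make the sketch concrete).
    return 1.0 / (1.0 + np.exp(-x))


def piecewise_constant_expectation(mu, sigma, edges, values):
    """E[g(x)] for x ~ N(mu, sigma^2), where g is piece-wise constant:
    g(x) = values[i] on [edges[i], edges[i+1]).
    Closed form: sum_i values[i] * (Phi((edges[i+1]-mu)/sigma)
                                    - Phi((edges[i]-mu)/sigma))."""
    cdf = norm.cdf((edges - mu) / sigma)
    probs = np.diff(cdf)  # Gaussian mass in each bin
    return np.sum(values * probs)


# Build the piece-wise constant approximation of the sigmoid:
# bin edges on a grid, constant value = sigmoid at the bin midpoint.
edges = np.linspace(-8.0, 8.0, 201)
mids = 0.5 * (edges[:-1] + edges[1:])
values = sigmoid(mids)

# Expectation under a variational Gaussian q(f) = N(mu, sigma^2),
# compared against a Monte Carlo estimate of E[sigmoid(f)].
mu, sigma = 0.7, 1.3
closed_form = piecewise_constant_expectation(mu, sigma, edges, values)
mc = sigmoid(np.random.default_rng(0).normal(mu, sigma, 200_000)).mean()
```

In the same spirit, the `values` array could itself be treated as free parameters and optimized, which is how the abstract's learnable inverse-link can be read: the closed-form expectation stays valid for any choice of bin values.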