We establish a general form of explicit, input-dependent, measure-valued warpings for learning nonstationary kernels. While stationary kernels are ubiquitous and simple to use, they struggle to adapt to functions whose smoothness varies with the input. The proposed learning algorithm warps inputs into conditional Gaussian measures that control the smoothness of a standard stationary kernel. This construction allows us to capture nonstationary patterns in the data and provides an intuitive inductive bias. The resulting method is based on sparse spectrum Gaussian processes, enabling closed-form solutions, and is extensible to a stacked construction that captures more complex patterns. The method is extensively validated alongside related algorithms on synthetic and real-world datasets. We demonstrate that the warping functions require remarkably few parameters in learning problems with both small and large data regimes.
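To illustrate the underlying idea of input-dependent smoothness (not the paper's measure-valued warping itself), the following minimal sketch implements the classical Gibbs nonstationary kernel, where a standard RBF kernel's lengthscale is modulated by the input; the `lengthscale` function here is a hypothetical choice for demonstration only.

```python
import numpy as np

def lengthscale(x):
    # Hypothetical input-dependent lengthscale: rougher near x = 0,
    # smoother farther away. The paper learns such behavior instead.
    return 0.1 + 0.5 * np.abs(x)

def gibbs_kernel(x1, x2):
    """Gibbs (1997) nonstationary RBF kernel:
    k(x, x') = sqrt(2 l(x) l(x') / (l(x)^2 + l(x')^2))
               * exp(-(x - x')^2 / (l(x)^2 + l(x')^2)).
    Valid (positive semidefinite) for any positive lengthscale function."""
    l1 = lengthscale(x1[:, None])
    l2 = lengthscale(x2[None, :])
    sq = l1**2 + l2**2
    prefactor = np.sqrt(2.0 * l1 * l2 / sq)
    return prefactor * np.exp(-((x1[:, None] - x2[None, :])**2) / sq)

x = np.linspace(-2.0, 2.0, 5)
K = gibbs_kernel(x, x)
print(K.shape)               # (5, 5)
print(np.allclose(K, K.T))   # symmetric: True
```

Under this construction, correlations decay quickly in regions where the lengthscale is small and slowly where it is large, which is the nonstationary behavior the proposed warping is designed to learn from data.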