Standard GPs offer a flexible modelling tool for well-behaved processes. However, deviations from Gaussianity are expected in real-world datasets, with structural outliers and shocks routinely observed. In these cases GPs can fail to model uncertainty adequately and may over-smooth inferences. Here we extend the GP framework to a new class of time-changed GPs that allow straightforward modelling of heavy-tailed non-Gaussian behaviours, while retaining a tractable conditional GP structure through a representation as an infinite mixture of non-homogeneous GPs. The conditional GP structure is obtained by conditioning the observations on a latent transformed input space, and the random evolution of the latent transformation is modelled by a L\'{e}vy process, which allows Bayesian inference on both the posterior predictive density and the latent transformation function. We present Markov chain Monte Carlo inference procedures for this model and demonstrate the potential benefits compared to a standard GP.
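To illustrate the construction described above, the following is a minimal forward-simulation sketch: conditional on a draw of a non-decreasing latent time change, the observations follow an ordinary GP evaluated on the transformed inputs. The choice of a Gamma subordinator for the latent transformation, the squared-exponential kernel, and all parameter values are illustrative assumptions, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(0)

def squared_exponential(x1, x2, lengthscale=1.0, variance=1.0):
    """Stationary SE kernel on two sets of 1-D inputs (illustrative choice)."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gamma_subordinator(t, rate=5.0, rng=rng):
    """Hypothetical latent time change: a non-decreasing Gamma Levy process,
    sampled on the observation grid via independent Gamma increments."""
    dt = np.diff(t, prepend=0.0)
    increments = rng.gamma(shape=rate * dt, scale=1.0 / rate)
    return np.cumsum(increments)

# Observation times and one draw of the latent transformed input space.
t = np.linspace(0.0, 10.0, 200)
tau = gamma_subordinator(t)

# Conditional on tau, the model is a (non-homogeneous) GP: evaluate the
# kernel on the transformed inputs and draw a sample path.
K = squared_exponential(tau, tau) + 1e-8 * np.eye(len(tau))
f = rng.multivariate_normal(mean=np.zeros(len(tau)), cov=K)

# Noisy observations; marginally over tau the process is non-Gaussian and
# can exhibit heavy-tailed, shock-like behaviour.
y = f + 0.1 * rng.standard_normal(len(tau))
```

Averaging such conditional GP draws over many realisations of the latent time change corresponds to the infinite-mixture-of-GPs view; the paper's MCMC procedures instead infer the latent transformation and predictive density from observed data.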