Neural Processes (NPs) are a family of conditional generative models that learn a distribution over functions, allowing them to make predictions at test time conditioned on a set of context points. A recent addition to this family, the Convolutional Conditional Neural Process (ConvCNP), has shown remarkable improvements over prior art, but we find that it sometimes struggles to generalize when applied to time series data. In particular, it is not robust to distribution shifts and fails to extrapolate observed patterns into the future. By incorporating a Gaussian Process into the model, we remedy this and at the same time improve performance within distribution. As an added benefit, the Gaussian Process reintroduces the possibility of sampling from the model, a key feature of other members of the NP family.