Stationary stochastic processes (SPs) are a key component of many probabilistic models, such as those for off-the-grid spatio-temporal data. They enable the statistical symmetry of underlying physical phenomena to be leveraged, thereby aiding generalization. Prediction in such models can be viewed as a translation equivariant map from observed data sets to predictive SPs, emphasizing the intimate relationship between stationarity and equivariance. Building on this, we propose the Convolutional Neural Process (ConvNP), which endows Neural Processes (NPs) with translation equivariance and extends convolutional conditional NPs to allow for dependencies in the predictive distribution. The latter enables ConvNPs to be deployed in settings which require coherent samples, such as Thompson sampling or conditional image completion. Moreover, we propose a new maximum-likelihood objective to replace the standard ELBO objective in NPs, which conceptually simplifies the framework and empirically improves performance. We demonstrate the strong performance and generalization capabilities of ConvNPs on 1D regression, image completion, and various tasks with real-world spatio-temporal data.