Conditional Neural Processes (CNPs; Garnelo et al., 2018) are an attractive family of meta-learning models which produce well-calibrated predictions, enable fast inference at test time, and are trainable via a simple maximum-likelihood procedure. A limitation of CNPs is their inability to model dependencies in the outputs. This significantly hurts predictive performance and renders it impossible to draw coherent function samples, limiting the applicability of CNPs in downstream applications and decision making. Neural Processes (NPs; Garnelo et al., 2018) attempt to alleviate this issue by using latent variables, relying on these to model output dependencies, but this introduces difficulties stemming from approximate inference. One recent alternative (Bruinsma et al., 2021), which we refer to as the FullConvGNP, models dependencies in the predictions while still being trainable via exact maximum likelihood. Unfortunately, the FullConvGNP relies on expensive 2D-dimensional convolutions, which restrict its applicability to only one-dimensional data. In this work, we present an alternative way to model output dependencies which also lends itself to maximum-likelihood training but, unlike the FullConvGNP, can be scaled to two- and three-dimensional data. The proposed models exhibit good performance in synthetic experiments.
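To illustrate the distinction the abstract draws, the following is a minimal sketch (not the paper's architecture, and with all quantities faked for illustration) contrasting a CNP-style mean-field predictive with a correlated Gaussian predictive of the kind that permits coherent function samples. The mean, variances, and covariance here are hypothetical stand-ins for network outputs.

```python
# Minimal sketch: mean-field (CNP-style) vs. correlated Gaussian predictives.
# All quantities below are illustrative stand-ins, not outputs of a trained model.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-2.0, 2.0, 100)       # target input locations

mean = np.sin(x)                      # hypothetical predicted mean
var = 0.1 * np.ones_like(x)           # hypothetical predicted marginal variances

# CNP-style predictive: independent Gaussians at each target location.
# Samples are white noise around the mean, not coherent function draws.
cnp_samples = mean + np.sqrt(var) * rng.standard_normal((3, x.size))

# Correlated Gaussian predictive: the model additionally outputs a full
# covariance over targets (faked here with an RBF kernel for illustration).
lengthscale = 0.5
cov = np.sqrt(np.outer(var, var)) * np.exp(
    -0.5 * (x[:, None] - x[None, :]) ** 2 / lengthscale**2
)
# Samples are smooth, coherent functions, which is what enables the
# downstream applications and decision making mentioned above.
gnp_samples = rng.multivariate_normal(mean, cov, size=3)
```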