Gaussian processes provide an elegant framework for specifying prior and posterior distributions over functions. They are, however, computationally expensive and limited by the expressivity of their covariance function. We propose Neural Diffusion Processes (NDPs), a novel approach based upon diffusion models, which learns to sample from distributions over functions. Using a novel attention block, we can incorporate properties of stochastic processes, such as exchangeability, directly into the NDP's architecture. We empirically show that NDPs are able to capture functional distributions that are close to the true Bayesian posterior of a Gaussian process. This enables a variety of downstream tasks, including hyperparameter marginalisation and Bayesian optimisation.