Deep Gaussian processes (DGPs), hierarchical compositions of GP models, have successfully boosted the expressive power of their single-layer counterpart. However, exact inference in DGPs is intractable, which has motivated the recent development of variational inference-based methods. Unfortunately, these methods either yield a biased posterior belief or make it difficult to evaluate their convergence. This paper introduces a new approach for specifying flexible, arbitrarily complex, and scalable approximate posterior distributions. The posterior distribution is constructed through a normalizing flow (NF) that transforms a simple initial probability distribution into a more complex one through a sequence of invertible transformations. Moreover, a novel convolutional normalizing flow (CNF) is developed to improve time efficiency and to capture dependency between layers. Empirical evaluation shows that CNF DGP outperforms the state-of-the-art approximation methods for DGPs.
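As background for the flow-based posterior construction described above, the following is the standard change-of-variables identity underlying normalizing flows (a general result, not a formula specific to this paper): given an initial density $q_0$ and a sequence of invertible maps $f_1, \ldots, f_K$,

\[
z_K = f_K \circ \cdots \circ f_1(z_0), \qquad z_0 \sim q_0(z_0),
\]
\[
\log q_K(z_K) = \log q_0(z_0) - \sum_{k=1}^{K} \log \left| \det \frac{\partial f_k}{\partial z_{k-1}} \right|.
\]

Each transformation $f_k$ must be invertible with a tractable Jacobian determinant so that the transformed density $q_K$ remains computable; the proposed CNF is, per the abstract, a flow of this kind designed for better time efficiency and cross-layer dependency.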