Bayesian neural networks provide a direct and natural way to extend standard deep neural networks to support probabilistic deep learning through the use of probabilistic layers that, traditionally, encode weight (and bias) uncertainty. In particular, hybrid Bayesian neural networks utilize standard deterministic layers together with a few probabilistic layers judiciously positioned in the network for uncertainty estimation. A major aspect and benefit of Bayesian inference is that priors, in principle, provide the means to encode prior knowledge for use in inference and prediction. However, it is difficult to specify priors on weights since the weights have no intuitive interpretation. Further, the relationships between priors on weights and the functions computed by networks are difficult to characterize. In contrast, functions are intuitive to interpret and are direct in that they map inputs to outputs. It is therefore natural to specify priors on functions to encode prior knowledge, and to use them in inference and prediction based on functions. To support this, we propose hybrid Bayesian neural networks with functional probabilistic layers that encode function (and activation) uncertainty. We discuss their foundations in functional Bayesian inference, functional variational inference, sparse Gaussian processes, and sparse variational Gaussian processes. We further perform a few proof-of-concept experiments using GPflux, a new library that provides Gaussian process layers and supports their use with deterministic Keras layers to form hybrid neural network and Gaussian process models.
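As a concrete illustration of the hybrid pattern, the sketch below stacks deterministic Keras layers with a GPflux GPLayer (a sparse variational Gaussian process layer) and trains the composite model with GPflux's LikelihoodLoss, following GPflux's documented Keras-integration pattern; the data, architecture, and hyperparameters are illustrative assumptions, not those used in our experiments.

```python
# Minimal sketch of a hybrid model: deterministic Keras layers followed by a
# functional probabilistic (sparse variational GP) layer from GPflux.
# All sizes and data below are illustrative assumptions.
import numpy as np
import tensorflow as tf
import gpflow
import gpflux

tf.keras.backend.set_floatx("float64")  # GPflow defaults to float64

# Toy regression data (illustrative only).
num_data, input_dim, latent_dim, num_inducing = 200, 4, 2, 20
X = np.random.randn(num_data, input_dim)
Y = np.sin(X).sum(axis=1, keepdims=True)

# Deterministic layers: a standard Keras feature extractor.
feature_extractor = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(latent_dim),
])

# Functional probabilistic layer: a sparse variational GP on the learned features.
kernel = gpflux.helpers.construct_basic_kernel(
    gpflow.kernels.SquaredExponential(), output_dim=1)
inducing_variable = gpflux.helpers.construct_basic_inducing_variables(
    num_inducing, input_dim=latent_dim, output_dim=1,
    share_variables=True, z_init=np.random.randn(num_inducing, latent_dim))
gp_layer = gpflux.layers.GPLayer(
    kernel, inducing_variable, num_data=num_data, num_latent_gps=1)

# The GP layer outputs a predictive distribution; LikelihoodLoss supplies the
# data-fit term of the ELBO, while the GP layer itself adds the KL term.
model = tf.keras.Sequential([feature_extractor, gp_layer])
model.compile(loss=gpflux.losses.LikelihoodLoss(gpflow.likelihoods.Gaussian()),
              optimizer=tf.keras.optimizers.Adam(0.01))
model.fit(X, Y, epochs=100, verbose=0)
```

At prediction time, calling the trained model on test inputs returns a TensorFlow Probability distribution whose mean and variance provide function-space uncertainty estimates. Note that in this plain-Keras setup the likelihood's own noise variance is left untrained for simplicity; GPflux's DeepGP and LikelihoodLayer machinery covers that case.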