Bayesian neural networks (BNNs) augment deep networks with uncertainty quantification through a Bayesian treatment of the network weights. However, such models face the challenge of Bayesian inference in a high-dimensional and usually over-parameterized space. This paper investigates a new line of Bayesian deep learning by performing Bayesian inference on the network structure. Instead of building the structure from scratch inefficiently, we draw inspiration from neural architecture search to represent the network structure. We then develop an efficient stochastic variational inference approach that unifies the learning of both the network structure and the weights. Empirically, our method exhibits competitive predictive performance while preserving the benefits of Bayesian principles across challenging scenarios. We also provide convincing experimental justification for our modeling choice.
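As one illustration of how a distribution over network structure can be made amenable to stochastic variational inference, a Gumbel-softmax relaxation over the candidate operations on each edge is a common device in neural architecture search. The sketch below is an illustrative assumption, not the paper's actual method; the operation names and parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax(logits, tau=1.0):
    """Sample a relaxed one-hot vector over candidate operations.

    The Gumbel-softmax reparameterization keeps the sampled structure
    differentiable w.r.t. the variational parameters `logits`, so
    structure and weights can be trained jointly by gradient descent.
    """
    # Gumbel(0, 1) noise via the inverse-CDF transform
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + g) / tau
    e = np.exp(y - y.max())  # softmax with max-subtraction for stability
    return e / e.sum()

# Variational logits over 3 hypothetical candidate operations on one
# edge (e.g. conv3x3, conv5x5, skip-connect); values are illustrative.
logits = np.array([1.0, 0.2, -0.5])
alpha = gumbel_softmax(logits, tau=0.5)
print(alpha)  # a relaxed one-hot vector; entries sum to 1
```

Lowering the temperature `tau` pushes the sample toward a discrete one-hot architecture choice, while higher temperatures give smoother, more exploratory mixtures.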