Bayesian neural networks place a posterior distribution over their weights; however, computing this posterior exactly remains a challenge, and variational inference offers a tractable approximation. Our work builds on variational inference techniques for Bayesian neural networks that maximize the original Evidence Lower Bound. In this paper, we present a stochastic Bayesian neural network in which we maximize the Evidence Lower Bound using a new objective function, which we call the Stochastic Evidence Lower Bound. We evaluate our network on five publicly available UCI datasets, using test RMSE and log likelihood as evaluation metrics. We demonstrate that our method not only outperforms previous state-of-the-art algorithms but also scales to larger datasets.
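For context, the standard Evidence Lower Bound referred to above takes the following form in the variational inference literature; the notation here (variational posterior $q_\theta(w)$ over weights $w$, prior $p(w)$, data $\mathcal{D}$) is ours, and the Stochastic Evidence Lower Bound introduced in the paper is a modified objective whose exact form is not reproduced here:
\[
\mathcal{L}(\theta) \;=\; \mathbb{E}_{q_\theta(w)}\big[\log p(\mathcal{D} \mid w)\big] \;-\; \mathrm{KL}\big(q_\theta(w)\,\|\,p(w)\big).
\]
Maximizing $\mathcal{L}(\theta)$ with respect to the variational parameters $\theta$ tightens a lower bound on the model evidence $\log p(\mathcal{D})$, which is what "maximizing the Evidence Lower Bound" means in this setting.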