The Kullback-Leibler (KL) divergence is widely used in variational inference of Bayesian neural networks (BNNs) to approximate the posterior distribution of the weights. However, the KL divergence is unbounded and asymmetric, which may lead to instabilities during optimization or yield poor generalization. To overcome these limitations, we examine the Jensen-Shannon (JS) divergence, which is more general, bounded, and symmetric. To this end, we propose two novel loss functions for BNNs: 1) a geometric JS divergence (JS-G) based loss function, which is symmetric but unbounded and admits a closed-form expression for Gaussian priors, and 2) a generalized JS divergence (JS-A) based loss function, which is symmetric and bounded. We show that the conventional KL divergence-based loss function is a special case of the loss functions presented in this work. To evaluate the divergence part of the proposed JS-G-based loss function, we use an exact closed-form expression for Gaussian priors; for any other priors of JS-G, and for the JS-A-based loss function, we use a Monte Carlo approximation. We provide algorithms to optimize the loss function using both of these methods. The proposed loss functions offer additional parameters that can be tuned to control the regularisation. We explain why the proposed loss functions should perform better than the state of the art. Further, we derive the conditions under which the proposed JS-G loss function regularises better than the KL divergence-based loss function for Gaussian priors and posteriors. The proposed JS divergence-based Bayesian convolutional neural networks (BCNNs) outperform state-of-the-art BCNNs, as shown on the classification of the CIFAR data set with various degrees of noise and on a biased histopathology data set.
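As a concrete illustration of the Monte Carlo route mentioned above, the sketch below estimates a skewed (weighted) JS divergence between two one-dimensional Gaussians via sampling. The function name, the 1-D setting, and the mixture-based skew weight `alpha` are illustrative assumptions for exposition; they are not the paper's exact JS-G or JS-A definitions.

```python
import numpy as np

def js_divergence_mc(mu_p, sig_p, mu_q, sig_q, alpha=0.5, n=200_000, seed=0):
    """Monte Carlo estimate of a skewed JS divergence between two 1-D
    Gaussians P and Q, built on the arithmetic mixture
        M = (1 - alpha) * P + alpha * Q,
        JS_alpha(P || Q) = (1 - alpha) * KL(P || M) + alpha * KL(Q || M).
    Illustrative sketch only; not the paper's JS-G/JS-A notation.
    """
    rng = np.random.default_rng(seed)

    def logpdf(x, mu, sig):
        # Log-density of N(mu, sig^2) evaluated at x.
        return -0.5 * ((x - mu) / sig) ** 2 - np.log(sig * np.sqrt(2 * np.pi))

    def log_mix(x):
        # Log-density of the mixture M, computed stably via logaddexp.
        lp = logpdf(x, mu_p, sig_p)
        lq = logpdf(x, mu_q, sig_q)
        return np.logaddexp(np.log(1 - alpha) + lp, np.log(alpha) + lq)

    xp = rng.normal(mu_p, sig_p, n)  # samples from P for KL(P || M)
    xq = rng.normal(mu_q, sig_q, n)  # samples from Q for KL(Q || M)
    kl_pm = np.mean(logpdf(xp, mu_p, sig_p) - log_mix(xp))
    kl_qm = np.mean(logpdf(xq, mu_q, sig_q) - log_mix(xq))
    return (1 - alpha) * kl_pm + alpha * kl_qm
```

For `alpha = 0.5` this estimator recovers the standard JS divergence, which is symmetric in its arguments and bounded above by ln 2 (in nats), in contrast to the unbounded KL divergence.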