Bayesian model reduction provides an efficient approach for comparing the performance of all nested sub-models of a model without re-evaluating any of these sub-models. Until now, Bayesian model reduction has been applied mainly in the computational neuroscience community. In this paper, we formulate and apply Bayesian model reduction to perform principled pruning of Bayesian neural networks, based on variational free energy minimization. This novel parameter pruning scheme addresses the shortcomings of many current state-of-the-art pruning methods used by the signal processing community. The proposed approach has a clear stopping criterion and minimizes the same objective that is used during training. In addition to these theoretical benefits, our experiments indicate better model performance compared to state-of-the-art pruning schemes.
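As a brief illustration of the idea referred to above, consider the standard Bayesian model reduction identity from the literature (stated here as background, not as a result specific to this paper): for a full prior $p(\theta)$, a reduced (pruned) prior $\tilde{p}(\theta)$ sharing the same likelihood, and the variational posterior $q(\theta)$ obtained by fitting the full model, the change in variational free energy of the reduced model is available in closed form, without refitting,
\[
\Delta F \;=\; \ln \int q(\theta)\,\frac{\tilde{p}(\theta)}{p(\theta)}\,\mathrm{d}\theta ,
\]
which is what allows all nested sub-models (e.g. sub-models obtained by shrinking the prior over individual network weights toward zero) to be scored against the same training objective at negligible cost.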