This paper identifies several characteristics of approximate MCMC in Bayesian deep learning and proposes an approximate sampling algorithm for neural networks. By analogy with sampling data minibatches from large datasets, it proposes sampling parameter subgroups from the high-dimensional parameter spaces of neural networks. While the advantages of minibatch MCMC have been discussed in the literature, blocked Gibbs sampling has received less research attention in Bayesian deep learning.
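The blocked sampling idea above can be sketched as Metropolis-within-Gibbs over parameter subgroups: at each iteration only one block of parameters is proposed and accepted or rejected, while the rest stay fixed. The toy regression model, the fixed block partition, and the step size below are illustrative assumptions for a minimal sketch, not the paper's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data standing in for a real dataset.
X = rng.normal(size=(50, 4))
true_w = np.array([1.0, -2.0, 0.5, 0.0])
y = X @ true_w + 0.1 * rng.normal(size=50)

def log_posterior(w):
    # Gaussian likelihood (noise std 0.1) plus standard normal prior on w.
    resid = y - X @ w
    return -0.5 * np.sum(resid**2) / 0.1**2 - 0.5 * np.sum(w**2)

def blocked_gibbs(n_iters=2000, block_size=2, step=0.02):
    """Metropolis-within-Gibbs: update one parameter subgroup per iteration."""
    w = np.zeros(4)
    blocks = [np.arange(i, i + block_size) for i in range(0, 4, block_size)]
    samples, accepted = [], 0
    for t in range(n_iters):
        idx = blocks[t % len(blocks)]          # cycle through the blocks
        prop = w.copy()
        prop[idx] += step * rng.normal(size=idx.size)  # perturb one block only
        if np.log(rng.uniform()) < log_posterior(prop) - log_posterior(w):
            w, accepted = prop, accepted + 1
        samples.append(w.copy())
    return np.array(samples), accepted / n_iters

samples, acc_rate = blocked_gibbs()
```

In a real Bayesian neural network the blocks would be, for example, the weights of individual layers, so each update evaluates the posterior with only one subgroup changed.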