Learning latent variable models with stochastic variational inference is challenging when the approximate posterior is far from the true posterior, due to high variance in the gradient estimates. We propose a novel rejection sampling step that discards samples from the variational posterior that are assigned low likelihoods by the model. Our approach provides an arbitrarily accurate approximation of the true posterior at the expense of extra computation. Using a new gradient estimator for the resulting unnormalized proposal distribution, we achieve average improvements of 3.71 nats and 0.21 nats over state-of-the-art single-sample and multi-sample alternatives, respectively, for estimating marginal log-likelihoods using sigmoid belief networks on the MNIST dataset.
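As a rough illustration of the rejection step described above, the following minimal Python sketch draws samples from a variational posterior and discards those the model assigns a low joint log-likelihood. The threshold-based acceptance rule, the helper names (rejected_elbo_samples, sample_q, joint_log_prob), and all parameter values are hypothetical stand-ins for exposition, not the paper's actual procedure or estimator.

```python
import numpy as np

def rejected_elbo_samples(x, sample_q, joint_log_prob, threshold, n_samples=64, rng=None):
    """Draw latents z ~ q(z|x) and discard those the model assigns a low joint
    log-likelihood log p(x, z). The surviving samples follow an unnormalized
    proposal distribution (hypothetical threshold rule, for illustration only)."""
    if rng is None:
        rng = np.random.default_rng()
    accepted = []
    for _ in range(n_samples):
        z = sample_q(x, rng)                   # z ~ q(z | x)
        if joint_log_prob(x, z) >= threshold:  # keep only high-likelihood samples
            accepted.append(z)
    return accepted

# Toy usage with a Gaussian variational posterior and a Gaussian "model" (illustrative only).
x_obs = np.array([0.5, -1.0])
q_sample = lambda x, rng: x + rng.normal(scale=1.0, size=x.shape)
log_joint = lambda x, z: -0.5 * np.sum((x - z) ** 2) - 0.5 * np.sum(z ** 2)
kept = rejected_elbo_samples(x_obs, q_sample, log_joint, threshold=-4.0)
print(f"accepted {len(kept)} of 64 samples")
```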