Restricted Boltzmann Machines (RBMs) are probabilistic generative models that can in principle be trained by maximum likelihood, but are usually trained in practice by an approximate algorithm called Contrastive Divergence (CD). In general, a CD-k algorithm estimates an average with respect to the model distribution using a sample obtained from a k-step Markov chain Monte Carlo algorithm (e.g., block Gibbs sampling) started from some initial configuration. Choices of k typically range from 1 to 100. This technical report explores whether it is possible to leverage a simple approximate sampling algorithm together with a modified version of CD in order to train an RBM with k=0. As usual, the method is illustrated on MNIST.
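For reference, the following is a minimal sketch of standard CD-k for a Bernoulli-Bernoulli RBM, not the modified k=0 scheme studied in this report. It is written with plain NumPy under the usual assumptions (binary visible and hidden units, block Gibbs sampling, chain initialized at the data); all names (`W`, `b_vis`, `b_hid`, `cd_k_grads`) are illustrative.

```python
# Minimal CD-k sketch for a Bernoulli-Bernoulli RBM (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_hidden(v, W, b_hid):
    """Block Gibbs step: sample hidden units given visible units."""
    p_h = sigmoid(v @ W + b_hid)
    return p_h, (rng.random(p_h.shape) < p_h).astype(float)

def sample_visible(h, W, b_vis):
    """Block Gibbs step: sample visible units given hidden units."""
    p_v = sigmoid(h @ W.T + b_vis)
    return p_v, (rng.random(p_v.shape) < p_v).astype(float)

def cd_k_grads(v_data, W, b_vis, b_hid, k=1):
    """CD-k gradient estimate (k >= 1): the positive phase uses the data,
    the negative phase uses a k-step Gibbs chain started at the data."""
    p_h_data, h = sample_hidden(v_data, W, b_hid)
    v, p_h_model = v_data, p_h_data
    for _ in range(k):                      # k Gibbs steps; k is 1..100 in practice
        _, v = sample_visible(h, W, b_vis)
        p_h_model, h = sample_hidden(v, W, b_hid)
    n = v_data.shape[0]
    dW = (v_data.T @ p_h_data - v.T @ p_h_model) / n
    db_vis = (v_data - v).mean(axis=0)
    db_hid = (p_h_data - p_h_model).mean(axis=0)
    return dW, db_vis, db_hid
```

With k=0 the Gibbs loop never runs and this plain estimate vanishes, which is why training with k=0 requires replacing the negative-phase sample with one drawn from a different approximate sampler, as discussed in the report.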