We develop amortized population Gibbs (APG) samplers, a class of scalable methods that frames structured variational inference as adaptive importance sampling. APG samplers construct high-dimensional proposals by iterating over updates to lower-dimensional blocks of variables. We train each conditional proposal by minimizing the inclusive KL divergence with respect to the conditional posterior. To appropriately account for the size of the input data, we develop a new parameterization in terms of neural sufficient statistics. Experiments show that APG samplers can train highly structured deep generative models in an unsupervised manner, and achieve substantial improvements in inference accuracy relative to standard autoencoding variational methods.
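The block-wise updating scheme described above can be illustrated with a toy sketch: a population of particles is updated one low-dimensional block at a time, reweighted by an incremental importance weight, and resampled. This is a minimal illustration, not the authors' method: a symmetric random-walk proposal stands in for the learned neural conditional proposals, the Gaussian model and all function names are hypothetical, and the inclusive-KL training step is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_joint(x, z):
    # Toy model: z[:, 0] = mu, z[:, 1] = log_sigma, broad N(0, 3^2) priors,
    # Gaussian likelihood of the observations x under each particle.
    mu, log_s = z[:, 0], z[:, 1]
    s = np.exp(log_s)
    log_prior = -0.5 * (mu ** 2 + log_s ** 2) / 9.0
    log_lik = np.sum(
        -0.5 * ((x[None, :] - mu[:, None]) / s[:, None]) ** 2
        - np.log(s[:, None]),
        axis=1)
    return log_prior + log_lik

def propose(particles, block, scale=0.3):
    # Symmetric random-walk proposal standing in for a learned conditional
    # q(z_block | x, z_rest); forward and reverse log-densities cancel.
    K = particles.shape[0]
    new_block = particles[:, block] + scale * rng.standard_normal((K, len(block)))
    zeros = np.zeros(K)
    return new_block, zeros, zeros

def apg_sweeps(x, particles, blocks, num_sweeps=50):
    # Sweep over blocks: propose, reweight by the joint-density ratio
    # (times the proposal ratio), then resample the population.
    K = particles.shape[0]
    for _ in range(num_sweeps):
        for block in blocks:
            proposed = particles.copy()
            proposed[:, block], lq_fwd, lq_rev = propose(particles, block)
            log_w = (log_joint(x, proposed) - log_joint(x, particles)
                     + lq_rev - lq_fwd)
            w = np.exp(log_w - log_w.max())
            w /= w.sum()
            particles = proposed[rng.choice(K, size=K, p=w)]
    return particles

x = rng.normal(2.0, 1.0, size=100)         # observations from N(2, 1)
particles = rng.standard_normal((256, 2))  # population of K latent particles
particles = apg_sweeps(x, particles, blocks=[[0], [1]])
```

After the sweeps, the population concentrates near the posterior, so the mean of `particles[:, 0]` approaches the true mean of 2. In the actual APG setting, the proposal for each block would instead be a trained neural network conditioned on the data and the remaining blocks.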