This work identifies the existence and cause of a type of posterior collapse that frequently occurs in Bayesian deep learning practice. For a general linear latent variable model that includes linear variational autoencoders as a special case, we precisely characterize the nature of posterior collapse: a competition between the likelihood term and the regularization of the posterior mean imposed by the prior. Our result also suggests that posterior collapse may be a general problem of learning with deeper architectures, and it deepens our understanding of Bayesian deep learning.
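The competition described above can be illustrated with a minimal sketch, not taken from the paper: a one-dimensional linear "VAE" with decoder p(x|z) = N(wz, σ²), encoder q(z|x) = N(ax, s²), and prior p(z) = N(0, 1), all in closed form. The function name `neg_elbo` and the specific parameter values are illustrative assumptions. When the decoder noise σ² is large, the KL term's penalty on the posterior mean dominates the likelihood, and the optimal encoder slope `a` shrinks toward zero, i.e. the posterior collapses toward the prior.

```python
import numpy as np

# Illustrative 1-D linear model: decoder x = w*z + eps with eps ~ N(0, sigma2),
# encoder q(z|x) = N(a*x, s2), prior p(z) = N(0, 1).
def neg_elbo(a, s2, w, sigma2, x2):
    # x2 = E[x^2] over the data; both ELBO terms average in closed form.
    # Expected negative log-likelihood (up to constants):
    recon = (1 - w * a) ** 2 * x2 / (2 * sigma2) + w**2 * s2 / (2 * sigma2)
    # KL(N(a*x, s2) || N(0, 1)) averaged over x, penalizing the mean a*x:
    kl = 0.5 * (s2 + a**2 * x2 - 1 - np.log(s2))
    return recon + kl

grid = np.linspace(0.0, 2.0, 2001)
x2 = 1.0  # data second moment

# Low decoder noise: the likelihood wins, so the best slope a stays large.
a_sharp = grid[np.argmin([neg_elbo(a, 0.1, w=1.0, sigma2=0.1, x2=x2) for a in grid])]
# High decoder noise: the prior's mean regularization wins and a collapses toward 0.
a_collapsed = grid[np.argmin([neg_elbo(a, 0.1, w=1.0, sigma2=10.0, x2=x2) for a in grid])]
print(a_sharp, a_collapsed)  # roughly 0.909 vs 0.091
```

The same trade-off is visible analytically: setting the derivative in `a` to zero gives a = (w x2/σ²) / (w² x2/σ² + x2), so the optimum is pulled to zero exactly as σ² grows relative to the signal.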