The variational autoencoder (VAE) is a well-established deep generative model built on an encoder-decoder framework in which the encoding neural network outputs a non-deterministic code for reconstructing an input. Rather than producing a deterministic code per input, the encoder samples from a distribution for every input. The key advantage of this design is that the trained network can serve as a generative model, producing samples from the data distribution beyond those provided for training. In this work we show that using batch normalization as the source of non-determinism suffices to turn deterministic autoencoders into generative models on par with variational ones, provided a suitable entropic regularization is added to the training objective.
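The core idea above can be illustrated with a minimal sketch: a batch-normalized encoder computes codes using *batch* statistics, so the code for a fixed input depends on which other samples share its batch, making the encoding non-deterministic without any explicit sampling layer. All names below (the toy linear encoder `W`, the variance-based `entropy_penalty`) are illustrative assumptions, not the paper's actual architecture or objective.

```python
import numpy as np

rng = np.random.default_rng(0)

def batch_norm(z, eps=1e-5):
    # Normalize with per-batch statistics: mean and variance are
    # recomputed for every batch, so the normalized code of a fixed
    # input changes with the batch composition.
    mu = z.mean(axis=0, keepdims=True)
    var = z.var(axis=0, keepdims=True)
    return (z - mu) / np.sqrt(var + eps)

# Toy linear encoder (hypothetical stand-in for a deep encoder).
W = rng.normal(size=(4, 2))

def encode(x_batch):
    return batch_norm(x_batch @ W)  # non-deterministic code per input

def entropy_penalty(codes):
    # Crude stand-in for an entropic regularizer: penalize codes whose
    # per-dimension variance collapses (low differential entropy).
    return float(-np.log(codes.var(axis=0) + 1e-5).sum())

# Embed the same input x in two different random batches.
x = rng.normal(size=(4,))
batch_a = np.vstack([x, rng.normal(size=(7, 4))])
batch_b = np.vstack([x, rng.normal(size=(7, 4))])

code_a = encode(batch_a)[0]
code_b = encode(batch_b)[0]
# Same input, different batches -> different codes.
print(np.allclose(code_a, code_b))
```

Running this prints `False`: the batch-dependent statistics alone make the encoder stochastic, which is the source of non-determinism the paper exploits; the entropy penalty would then be added to the reconstruction loss during training.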