The Gaussianity assumption has been pointed out as the main limitation of the Variational AutoEncoder (VAE), despite its computational convenience. To improve the distributional capacity (i.e., the expressive power of the distributional family) of the VAE, we propose a new VAE learning method with a nonparametric distributional assumption on its generative model. By estimating an infinite number of conditional quantiles, our proposed VAE model directly estimates the conditional cumulative distribution function, and we call this approach distributional learning of the VAE. Furthermore, by adopting the continuous ranked probability score (CRPS) loss, our proposed learning method becomes computationally tractable. To evaluate how well the underlying distribution of the dataset is captured, we apply our model to synthetic data generation based on inverse transform sampling. Numerical results with real tabular datasets corroborate our arguments.
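The two ingredients named above, inverse transform sampling through an estimated quantile function and the CRPS loss, can be illustrated with a minimal sketch. This is not the paper's implementation: the closed-form Exponential quantile below is a hypothetical stand-in for the conditional quantile function the proposed VAE would learn, and the CRPS here is the standard sample-based energy-form estimate.

```python
# Illustrative sketch only. `exponential_quantile` is a hypothetical
# stand-in for a learned conditional quantile function; it is NOT the
# paper's model.
import numpy as np


def exponential_quantile(tau, rate=2.0):
    """Closed-form inverse CDF of Exponential(rate): Q(tau) = -ln(1 - tau) / rate."""
    return -np.log1p(-tau) / rate


def inverse_transform_sample(quantile_fn, n, rng):
    """Push Uniform(0, 1) draws through the quantile function to obtain samples."""
    u = rng.uniform(0.0, 1.0, size=n)
    return quantile_fn(u)


def crps_ensemble(samples, y):
    """Energy-form CRPS estimate: E|X - y| - 0.5 * E|X - X'|."""
    return (np.mean(np.abs(samples - y))
            - 0.5 * np.mean(np.abs(samples[:, None] - samples[None, :])))


rng = np.random.default_rng(0)
samples = inverse_transform_sample(exponential_quantile, 2000, rng)
crps = crps_ensemble(samples, y=0.5)
```

Because Exponential(2) has mean 0.5, the sample mean should land near 0.5, and the energy-form CRPS estimate is nonnegative by construction. In the paper's setting, the same sampling recipe would be driven by the VAE's estimated conditional quantiles rather than a closed-form inverse CDF.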