We propose a deep generative approach to sampling from a conditional distribution based on a unified formulation of the conditional distribution and the generalized nonparametric regression function via the noise-outsourcing lemma. The proposed approach learns a conditional generator so that a random sample from the target conditional distribution can be obtained by applying the conditional generator to a sample drawn from a reference distribution. The conditional generator is estimated nonparametrically with neural networks by matching appropriate joint distributions using the Kullback-Leibler divergence. An appealing aspect of our method is that it allows either or both of the predictor and the response to be high-dimensional and can handle both continuous and discrete predictors and responses. We show that the proposed method is consistent in the sense that the conditional generator converges in distribution to the underlying conditional distribution under mild conditions. Our numerical experiments with simulated and benchmark image data validate the proposed method and demonstrate that it outperforms several existing conditional density estimation methods.
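The sampling mechanism described above can be sketched with a toy example. Here the generator is known in closed form rather than learned by a neural network as in the paper: assuming Y | X = x ~ N(x, 1), the map g(eta, x) = x + eta with reference noise eta ~ N(0, 1) reproduces the conditional distribution, illustrating the noise-outsourcing idea Y = g(eta, X).

```python
import numpy as np

# Hypothetical closed-form conditional generator (not the paper's
# neural estimator): for Y | X = x ~ N(x, 1), pushing reference
# noise eta ~ N(0, 1) through g(eta, x) = x + eta yields samples
# from the target conditional distribution.

def conditional_generator(eta, x):
    """Map reference noise eta and predictor value x to a sample of Y | X = x."""
    return x + eta

rng = np.random.default_rng(0)

x = 2.5                                  # condition on X = 2.5
eta = rng.standard_normal(100_000)       # draws from the reference distribution
samples = conditional_generator(eta, x)  # draws from P(Y | X = 2.5)

print(samples.mean(), samples.std())     # close to 2.5 and 1.0
```

In the proposed method, `conditional_generator` would instead be a neural network trained so that the joint distribution of (X, g(eta, X)) matches that of (X, Y) under the Kullback-Leibler divergence.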