Learning high-dimensional distributions is often done with explicit likelihood modeling or implicit modeling via minimizing integral probability metrics (IPMs). In this paper, we expand this learning paradigm to stochastic orders, namely the convex or Choquet order between probability measures. Towards this end, we introduce the Choquet-Toland distance between probability measures, which can be used as a drop-in replacement for IPMs. We also introduce the Variational Dominance Criterion (VDC) for learning probability measures under dominance constraints that encode the desired stochastic order between the learned measure and a known baseline. We analyze both quantities, show that they suffer from the curse of dimensionality, and propose surrogates via input convex maxout networks (ICMNs) that enjoy parametric rates. Finally, we provide a min-max framework for learning with stochastic orders and validate it experimentally on synthetic data and high-dimensional image generation, with promising results. The code is available at https://github.com/yair-schiff/stochastic-orders-ICMN
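As a rough illustration of the architecture named above, the following is a minimal PyTorch sketch of an input convex maxout network. The hyperparameter names (`width`, `pieces`, `depth`) and the weight-clamping trick for enforcing non-negativity are illustrative assumptions, not the parameterization used in the paper or its repository.

```python
import torch
import torch.nn as nn

class ICMN(nn.Module):
    """Sketch of an input convex maxout network (ICMN).

    Convexity in the input x follows from two facts:
      (1) a max over affine functions of x is convex;
      (2) non-negative combinations of convex functions, plus an
          affine term in x, followed by a pointwise max, stay convex.
    """

    def __init__(self, dim: int, width: int = 64, pieces: int = 4, depth: int = 2):
        super().__init__()
        self.width, self.pieces = width, pieces
        # First layer: maxout over `pieces` affine maps of x (convex, piecewise linear).
        self.first = nn.Linear(dim, width * pieces)
        # Hidden layers: per-piece non-negative mixing of previous activations
        # plus a direct affine skip connection from x.
        self.mix = nn.ParameterList(
            [nn.Parameter(0.1 * torch.rand(pieces, width, width)) for _ in range(depth)]
        )
        self.skips = nn.ModuleList(
            [nn.Linear(dim, width * pieces) for _ in range(depth)]
        )
        # Non-negative readout keeps the scalar output convex in x.
        self.readout = nn.Parameter(0.1 * torch.rand(width))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Reshape to (batch, width, pieces) and take the max over pieces.
        z = self.first(x).view(-1, self.width, self.pieces).amax(dim=-1)
        for W, skip in zip(self.mix, self.skips):
            # Clamping enforces non-negativity of the mixing weights,
            # which preserves convexity in x through the layer.
            h = torch.einsum('bw,pvw->bvp', z, W.clamp(min=0))
            z = (h + skip(x).view(-1, self.width, self.pieces)).amax(dim=-1)
        return z @ self.readout.clamp(min=0)


# Usage: a convex, piecewise-linear critic on 2-D inputs.
f = ICMN(dim=2)
print(f(torch.randn(8, 2)).shape)  # torch.Size([8])
```

Clamping the mixing weights at forward time is one simple way to keep them non-negative; an alternative is to project the weights onto the non-negative orthant after each optimizer step.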