Learning high-dimensional distributions is often done with explicit likelihood modeling or implicit modeling via minimizing integral probability metrics (IPMs). In this paper, we expand this learning paradigm to stochastic orders, namely the convex or Choquet order between probability measures. Towards this end, exploiting the relation between convex orders and optimal transport, we introduce the Choquet-Toland distance between probability measures, which can be used as a drop-in replacement for IPMs. We also introduce the Variational Dominance Criterion (VDC) to learn probability measures with dominance constraints that encode the desired stochastic order between the learned measure and a known baseline. We analyze both quantities, show that they suffer from the curse of dimensionality, and propose surrogates via input convex maxout networks (ICMNs) that enjoy parametric statistical rates. We provide a min-max framework for learning with stochastic orders and validate it experimentally on synthetic and high-dimensional image generation tasks, with promising results. Finally, our ICMN class of convex functions and its derived Rademacher complexity bound are of independent interest beyond their application to convex orders.
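To make the ICMN construction concrete, here is a minimal sketch (in PyTorch, not the authors' code) of an input convex maxout network. Convexity in the input follows from two standard facts: a non-negative combination of convex functions plus an affine function of the input is convex, and a pointwise maximum (the maxout activation) of convex functions is convex. The class name, layer layout, and hyperparameters (`width`, `depth`, `pieces`) are illustrative assumptions, not the paper's exact architecture.

```python
# Minimal sketch of an input convex maxout network (ICMN).
# Assumption: each hidden layer combines an unconstrained affine map of the
# input x with a non-negatively weighted map of the previous (convex) layer,
# followed by a maxout over affine pieces, which preserves convexity in x.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ICMN(nn.Module):
    def __init__(self, dim, width=64, depth=3, pieces=4):
        super().__init__()
        self.width, self.pieces = width, pieces
        # Unconstrained affine maps from the input x to every layer.
        self.Wx = nn.ModuleList(
            [nn.Linear(dim, width * pieces) for _ in range(depth)]
        )
        # Maps from the previous layer; weights are clamped to be
        # non-negative in the forward pass to preserve convexity.
        self.Wz = nn.ModuleList(
            [nn.Linear(width, width * pieces, bias=False) for _ in range(depth - 1)]
        )
        self.head = nn.Linear(width, 1, bias=False)  # also kept non-negative

    def forward(self, x):
        n = x.shape[0]
        # First layer: maxout over affine functions of x, hence convex in x.
        z = self.Wx[0](x).view(n, self.width, self.pieces).amax(dim=-1)
        for wx, wz in zip(self.Wx[1:], self.Wz):
            # Non-negative weights on z keep every piece convex in x;
            # the maxout over pieces then stays convex.
            pre = wx(x) + F.linear(z, wz.weight.clamp(min=0.0))
            z = pre.view(n, self.width, self.pieces).amax(dim=-1)
        return F.linear(z, self.head.weight.clamp(min=0.0)).squeeze(-1)


if __name__ == "__main__":
    f = ICMN(dim=2)
    x = torch.randn(8, 2)
    print(f(x).shape)  # torch.Size([8]), one convex value per sample
```

Clamping the mixing weights in the forward pass is one simple way to enforce the non-negativity constraint; reparametrizations such as squaring or softplus are common alternatives, and the choice does not affect the convexity argument.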