Quantifying uncertainty in predictions, or more generally estimating the posterior conditional distribution, is a core challenge in machine learning and statistics. We introduce Convex Nonparanormal Regression (CNR), a conditional nonparanormal approach for this task. CNR fits the posterior via a convex optimization over a rich dictionary of pre-defined nonlinear transformations of Gaussians. It can fit an arbitrary conditional distribution, including multimodal and non-symmetric posteriors. For the special but powerful case of a piecewise-linear dictionary, we provide a closed-form expression for the posterior mean, which can be used for pointwise prediction. Finally, we demonstrate the advantages of CNR over classical competitors on synthetic and real-world data.
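To make the nonparanormal idea concrete, the minimal sketch below (our own illustration, not the paper's CNR implementation) pushes Gaussian samples through a nonnegative combination of piecewise-linear hinge functions, a simple instance of a pre-defined dictionary; the knot locations and weights are hypothetical, chosen only to show that the transformed variable can be strongly non-Gaussian.

```python
# Illustrative sketch: a fixed piecewise-linear "dictionary" applied to a
# Gaussian latent variable yields a flexible, non-Gaussian distribution.
# Knots and weights below are hypothetical, not learned by CNR.
import numpy as np

rng = np.random.default_rng(0)

knots = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])   # assumed hinge locations
weights = np.array([0.1, 0.3, 1.5, 0.4, 0.2])   # assumed nonnegative weights

def piecewise_linear(z, knots, weights):
    """Monotone map: nonnegative combination of hinge functions of z."""
    hinges = np.maximum(z[:, None] - knots[None, :], 0.0)  # shape (n, K)
    return hinges @ weights

z = rng.standard_normal(100_000)          # Gaussian latent samples
y = piecewise_linear(z, knots, weights)   # transformed, non-Gaussian samples

# The latent variable is symmetric; the transformed one is clearly skewed.
print("latent skewness      :", np.mean(((z - z.mean()) / z.std()) ** 3))
print("transformed skewness :", np.mean(((y - y.mean()) / y.std()) ** 3))
```

In the conditional setting described in the abstract, the weights of such a dictionary would depend on the input features and be learned by solving a convex problem; this sketch only illustrates the expressiveness of the transformation itself.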