Motivated by the statistical and computational challenges of computing Wasserstein distances in high-dimensional contexts, machine learning researchers have defined modified Wasserstein distances based on computing distances between one-dimensional projections of the measures. Different choices of how to aggregate these projected distances (averaging, random sampling, maximizing) give rise to different distances, requiring different statistical analyses. We define the \emph{Sliced Wasserstein Process}, a stochastic process defined by the empirical Wasserstein distance between projections of empirical probability measures to all one-dimensional subspaces, and prove a uniform distributional limit theorem for this process. As a result, we obtain a unified framework in which to prove distributional limit results for all Wasserstein distances based on one-dimensional projections. We illustrate these results on a number of examples where no distributional limits were previously known.
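The construction described above — projecting both measures onto one-dimensional subspaces, computing the 1-D Wasserstein distance of each projection, and then aggregating by averaging or maximizing — can be illustrated with a minimal Monte Carlo sketch. This is not the paper's estimator; the function names, the Gaussian sampling of directions, and the equal-sample-size assumption are all choices made for illustration only.

```python
import numpy as np

def wasserstein_1d(x, y, p=2):
    """p-Wasserstein distance between two 1-D empirical measures
    with the same number of samples, via the sorted-samples formula."""
    xs, ys = np.sort(x), np.sort(y)
    return (np.mean(np.abs(xs - ys) ** p)) ** (1.0 / p)

def sliced_wasserstein(X, Y, n_proj=100, p=2, agg="mean", seed=None):
    """Monte Carlo sliced Wasserstein between empirical measures X, Y
    (arrays of shape (n, d)). `agg="mean"` averages the projected
    distances; `agg="max"` takes the maximum (a max-sliced variant).
    All parameter names here are illustrative, not from the paper."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    dists = np.empty(n_proj)
    for i in range(n_proj):
        # Sample a uniform direction on the unit sphere S^{d-1}.
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)
        # 1-D Wasserstein distance between the projected samples.
        dists[i] = wasserstein_1d(X @ theta, Y @ theta, p)
    if agg == "mean":
        return (np.mean(dists ** p)) ** (1.0 / p)
    return dists.max()
```

Each choice of `agg` corresponds to one of the aggregation schemes mentioned above; the Sliced Wasserstein Process can be thought of as tracking the projected distance as a function of the direction `theta` simultaneously over all directions, rather than collapsing it to a single number.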