Discrepancy measures between probability distributions, often termed statistical distances, are ubiquitous in probability theory, statistics, and machine learning. To combat the curse of dimensionality when estimating these distances from data, recent work has proposed smoothing out local irregularities in the measured distributions via convolution with a Gaussian kernel. Motivated by the scalability of this framework to high dimensions, we investigate the structural and statistical behavior of the Gaussian-smoothed $p$-Wasserstein distance $\mathsf{W}_p^{(\sigma)}$, for arbitrary $p\geq 1$. After establishing basic metric and topological properties of $\mathsf{W}_p^{(\sigma)}$, we explore the asymptotic statistical behavior of $\mathsf{W}_p^{(\sigma)}(\hat{\mu}_n,\mu)$, where $\hat{\mu}_n$ is the empirical distribution of $n$ independent observations from $\mu$. We prove that $\mathsf{W}_p^{(\sigma)}$ enjoys a parametric empirical convergence rate of $n^{-1/2}$, in contrast to the $n^{-1/d}$ rate for the unsmoothed $\mathsf{W}_p$ when $d \geq 3$. Our proof relies on controlling $\mathsf{W}_p^{(\sigma)}$ by a $p$th-order smooth Sobolev distance $\mathsf{d}_p^{(\sigma)}$ and deriving the limit distribution of $\sqrt{n}\,\mathsf{d}_p^{(\sigma)}(\hat{\mu}_n,\mu)$ for all dimensions $d$. As applications, we provide asymptotic guarantees for two-sample testing and minimum distance estimation using $\mathsf{W}_p^{(\sigma)}$, with experiments for $p=2$ using a maximum mean discrepancy formulation of $\mathsf{d}_2^{(\sigma)}$.
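For orientation, the smooth distance compares the two measures after convolution with an isotropic Gaussian $\mathcal{N}_\sigma := \mathcal{N}(0,\sigma^2 \mathrm{I}_d)$:
$$\mathsf{W}_p^{(\sigma)}(\mu,\nu) := \mathsf{W}_p\big(\mu * \mathcal{N}_\sigma,\, \nu * \mathcal{N}_\sigma\big).$$
The sketch below is a minimal Monte Carlo illustration of this definition, not the paper's experiment code: it approximates each smoothed empirical measure by adding one Gaussian draw per data point and then solves the resulting discrete optimal transport problem. It assumes the POT library (imported as `ot`); the helper name `smooth_wasserstein` is ours.

```python
# Minimal sketch: Monte Carlo estimate of the Gaussian-smoothed
# p-Wasserstein distance between two empirical measures.
# Assumes the POT library (pip install pot); not the authors' code.
import numpy as np
import ot  # Python Optimal Transport (POT)

def smooth_wasserstein(x, y, p=2, sigma=1.0, seed=0):
    """Estimate W_p^{(sigma)} between the empirical measures of x and y.

    x : (n, d) array of samples from mu
    y : (m, d) array of samples from nu
    """
    rng = np.random.default_rng(seed)
    # Approximate convolution with N(0, sigma^2 I_d) by perturbing
    # each sample with a single Gaussian draw.
    x_s = x + sigma * rng.standard_normal(x.shape)
    y_s = y + sigma * rng.standard_normal(y.shape)
    # Uniform weights on the smoothed empirical supports.
    a = np.full(len(x_s), 1.0 / len(x_s))
    b = np.full(len(y_s), 1.0 / len(y_s))
    # p-th power Euclidean cost; W_p is the optimal cost to the power 1/p.
    M = ot.dist(x_s, y_s, metric='euclidean') ** p
    return ot.emd2(a, b, M) ** (1.0 / p)

if __name__ == "__main__":
    # Example in d = 5, where the unsmoothed empirical W_p suffers the
    # n^{-1/d} rate but the smoothed distance converges at n^{-1/2}.
    rng = np.random.default_rng(1)
    x = rng.standard_normal((500, 5))
    y = rng.standard_normal((500, 5)) + 0.5
    print(smooth_wasserstein(x, y, p=2, sigma=1.0))
```

Averaging the estimate over several independent noise seeds reduces the Monte Carlo error introduced by the single-draw smoothing approximation.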