This paper concerns the convergence of empirical measures in high dimensions. We propose a new class of metrics and show that, under such metrics, the convergence is free of the curse of dimensionality (CoD). Such a feature is critical for high-dimensional analysis and stands in contrast to classical metrics ({\it e.g.}, the Wasserstein distance). The proposed metrics originate from the maximum mean discrepancy, which we generalize by proposing specific criteria for selecting test function spaces that guarantee the absence of CoD. Accordingly, we call this class of metrics the generalized maximum mean discrepancy (GMMD). Examples of the selected test function spaces include the reproducing kernel Hilbert space, the Barron space, and flow-induced function spaces. Three applications of the proposed metrics are presented: 1. The convergence of empirical measures of random variables; 2. The convergence of the $n$-particle system to the solution of the McKean--Vlasov stochastic differential equation; 3. The construction of an $\varepsilon$-Nash equilibrium for a homogeneous $n$-player game via its mean-field limit. As a byproduct, we prove that, given a distribution close to the target distribution measured by GMMD and a certain representation of the target distribution, we can generate a distribution close to the target one in terms of the Wasserstein distance and relative entropy. Overall, we show that the proposed class of metrics is a powerful tool for analyzing the convergence of empirical measures in high dimensions without CoD.
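For orientation, metrics of the maximum-mean-discrepancy type are integral probability metrics indexed by a test function space $\mathcal{F}$; a minimal sketch of this standard form (the precise normalization and the selection criteria for $\mathcal{F}$ defining GMMD are given in the body of the paper, not here) is
\[
d_{\mathcal{F}}(\mu,\nu) \;=\; \sup_{f \in \mathcal{F},\, \|f\|_{\mathcal{F}} \le 1} \left| \int f \, \mathrm{d}\mu - \int f \, \mathrm{d}\nu \right|,
\]
where $\mu$ and $\nu$ are probability measures and $\|\cdot\|_{\mathcal{F}}$ denotes the norm of the chosen test function space ({\it e.g.}, an RKHS or Barron norm).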