We rigorously quantify the improvement in the sample complexity of variational divergence estimation for group-invariant distributions. For the Wasserstein-1 metric and the Lipschitz-regularized $\alpha$-divergences, the reduction in sample complexity is proportional to an ambient-dimension-dependent power of the group size. For the maximum mean discrepancy (MMD), the improvement is more nuanced, as it depends not only on the group size but also on the choice of kernel. Numerical simulations verify our theoretical results.
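To make the MMD setting concrete, the following is a minimal illustrative sketch, not the paper's experiment: it estimates the Gaussian-kernel MMD between a large reference sample from a rotation-invariant 2D distribution and a small empirical sample, with and without symmetrizing the small sample over the cyclic group $C_k$. The distribution, the group size `k`, the sample sizes, the bandwidth `sigma`, and the helper names `sample_invariant` and `mmd2` are all illustrative assumptions.

```python
# Illustrative sketch (assumptions throughout): compare the Gaussian-kernel MMD
# to a reference sample for raw vs. orbit-symmetrized empirical samples.
import numpy as np

rng = np.random.default_rng(0)
k = 8  # size of the cyclic rotation group C_k (illustrative choice)
angles = 2 * np.pi * np.arange(k) / k
rotations = np.stack(
    [[[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]] for a in angles]
)  # shape (k, 2, 2)

def sample_invariant(n):
    """Draw from a C_k-invariant mixture: unit Gaussians centered on the orbit of (2, 0)."""
    centers = rotations @ np.array([2.0, 0.0])  # (k, 2)
    idx = rng.integers(k, size=n)
    return centers[idx] + rng.normal(size=(n, 2))

def gaussian_gram(x, y, sigma=1.0):
    """Gaussian-kernel Gram matrix between point clouds x and y."""
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def mmd2(x, y, sigma=1.0):
    """Biased (V-statistic) estimate of the squared MMD with a Gaussian kernel."""
    return (
        gaussian_gram(x, x, sigma).mean()
        - 2 * gaussian_gram(x, y, sigma).mean()
        + gaussian_gram(y, y, sigma).mean()
    )

n, n_ref = 200, 2000
x = sample_invariant(n)
ref = sample_invariant(n_ref)  # large sample standing in for the true distribution
# Orbit-symmetrized sample: apply every group element to every point.
x_sym = (rotations @ x.T).transpose(0, 2, 1).reshape(-1, 2)

print("raw         MMD^2:", mmd2(x, ref))
print("symmetrized MMD^2:", mmd2(x_sym, ref))
```

Under these assumptions one would expect the orbit-symmetrized sample to yield a smaller estimated MMD against the reference, consistent with an improved sample complexity for group-invariant targets; the size of the gap will depend on the kernel bandwidth, in line with the kernel dependence noted above.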