An important feature of kernel mean embeddings (KMEs) is that the rate of convergence of the empirical KME to the true distribution KME can be bounded independently of the dimension of the space, the properties of the distribution, and the smoothness of the kernel. We show how to speed up convergence by leveraging variance information in the RKHS. Furthermore, we show that even when such information is a priori unknown, we can efficiently estimate it from the data, recovering the desideratum of a distribution-agnostic bound that enjoys acceleration in fortuitous settings. We illustrate our methods in the context of hypothesis testing and robust parametric estimation.
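As context for the dimension-free rate mentioned above, here is a standard formulation from the KME literature (not taken from this paper, and stated only up to the usual constants). Let $k$ be a kernel with $\sup_x k(x,x) \le B^2$, let $\mu_P = \mathbb{E}_{x\sim P}[k(x,\cdot)]$, and let $\hat\mu_n = \frac{1}{n}\sum_{i=1}^n k(x_i,\cdot)$ for an i.i.d. sample $x_1,\dots,x_n \sim P$. Then with probability at least $1-\delta$,
\[
\|\hat\mu_n - \mu_P\|_{\mathcal{H}} \;\le\; \frac{B}{\sqrt{n}}\Bigl(1 + \sqrt{2\log(1/\delta)}\Bigr),
\]
a bound that involves neither the dimension of the input space, nor $P$, nor the smoothness of $k$. A Bernstein-type refinement replaces the leading $B$ by the RKHS standard deviation, with $\sigma^2 = \mathbb{E}_{x\sim P}\|k(x,\cdot) - \mu_P\|_{\mathcal{H}}^2$: up to universal constants,
\[
\|\hat\mu_n - \mu_P\|_{\mathcal{H}} \;\lesssim\; \sqrt{\frac{\sigma^2 \log(2/\delta)}{n}} \;+\; \frac{B \log(2/\delta)}{n},
\]
which accelerates whenever $\sigma^2 \ll B^2$. This is the kind of variance information in the RKHS that the abstract refers to.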
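As a toy illustration of the hypothesis-testing application, the sketch below computes the standard unbiased MMD² statistic, i.e. the squared RKHS distance between two empirical KMEs, with an RBF kernel. This is only a minimal baseline sketch: the variance-aware estimators proposed in the paper are not reproduced here, and the helper names `rbf_kernel`, `mmd2_unbiased`, and the bandwidth `gamma=1.0` are illustrative choices, not part of the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||X[i] - Y[j]||^2)."""
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def mmd2_unbiased(X, Y, gamma=1.0):
    """Unbiased estimate of ||mu_P - mu_Q||_H^2 from samples X ~ P, Y ~ Q."""
    m, n = len(X), len(Y)
    Kxx = rbf_kernel(X, X, gamma)
    Kyy = rbf_kernel(Y, Y, gamma)
    Kxy = rbf_kernel(X, Y, gamma)
    # Drop diagonal terms so the within-sample averages are unbiased U-statistics.
    term_xx = (Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))
    term_yy = (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
    term_xy = 2.0 * Kxy.mean()
    return term_xx + term_yy - term_xy

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(500, 2))   # sample from P
Y = rng.normal(0.5, 1.0, size=(500, 2))   # sample from a mean-shifted Q
print(mmd2_unbiased(X, Y))                # noticeably > 0 when P != Q
```

A two-sample test then compares this statistic against a threshold, commonly calibrated by permuting the pooled sample; how fast the statistic concentrates around its population value is exactly where the convergence rate of the empirical KME enters.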