Maximum mean discrepancies (MMDs) like the kernel Stein discrepancy (KSD) have grown central to a wide range of applications, including hypothesis testing, sampler selection, distribution approximation, and variational inference. In each setting, these kernel-based discrepancy measures are required to (i) separate a target P from other probability measures or even (ii) control weak convergence to P. In this article we derive new sufficient and necessary conditions to ensure (i) and (ii). For MMDs on separable metric spaces, we characterize those kernels that separate Bochner embeddable measures and introduce simple conditions for separating all measures with unbounded kernels and for controlling convergence with bounded kernels. We use these results on $\mathbb{R}^d$ to substantially broaden the known conditions for KSD separation and convergence control and to develop the first KSDs known to exactly metrize weak convergence to P. Along the way, we highlight the implications of our results for hypothesis testing, measuring and improving sample quality, and sampling with Stein variational gradient descent.
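For concreteness, the displayed equations below give a standard recap of the two discrepancies named above, using the usual Langevin Stein operator convention on $\mathbb{R}^d$; the symbols $k$, $k_P$, $s_p$, and $\mathcal{H}_k$ are introduced here for illustration and are not taken from the article's own notation.
\[
\mathrm{MMD}_k(Q, P) \;=\; \sup_{\|f\|_{\mathcal{H}_k} \le 1} \bigl|\mathbb{E}_{X \sim Q}[f(X)] - \mathbb{E}_{Z \sim P}[f(Z)]\bigr|,
\qquad
\mathrm{KSD}_{k_P}(Q) \;=\; \mathrm{MMD}_{k_P}(Q, P),
\]
\[
k_P(x, y) \;=\; \nabla_x \cdot \nabla_y k(x, y) \;+\; s_p(x)^\top \nabla_y k(x, y) \;+\; s_p(y)^\top \nabla_x k(x, y) \;+\; s_p(x)^\top s_p(y)\, k(x, y),
\]
where $\mathcal{H}_k$ is the reproducing kernel Hilbert space of a base kernel $k$ and $s_p = \nabla \log p$ is the score of the target density $p$. Because $\mathbb{E}_{Z \sim P}[k_P(\cdot, Z)] = 0$ under mild conditions, the KSD can be evaluated from samples of $Q$ and the score alone, even when $P$ is known only up to normalization.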