Quantifying the deviation of a probability distribution from a target distribution is challenging when the target is defined by a density with an intractable normalizing constant. The kernel Stein discrepancy (KSD) was proposed to address this problem and has been applied to various tasks, including diagnosing approximate MCMC samplers and goodness-of-fit testing for unnormalized statistical models. This article investigates a convergence control property of the diffusion kernel Stein discrepancy (DKSD), an instance of the KSD proposed by Barp et al. (2019). We extend the result of Gorham and Mackey (2017), which showed that the KSD controls the bounded-Lipschitz metric, to functions of polynomial growth. Specifically, we prove that the DKSD controls the integral probability metric defined by a class of pseudo-Lipschitz functions, a polynomial generalization of Lipschitz functions. We also provide practical sufficient conditions on the reproducing kernel for this property to hold. In particular, we show that the DKSD detects non-convergence in moments with an appropriately chosen kernel.
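For context, a brief sketch of the objects referenced above, in the standard notation of the KSD literature; the paper's own definitions (in particular the exact form of the diffusion Stein operator and the normalization of the pseudo-Lipschitz class) may differ in details. For a target P with (possibly unnormalized) density p, a reproducing kernel Hilbert space H_k, and a Langevin-type diffusion Stein operator with matrix-valued coefficient m, the discrepancy takes the form

\[
\mathrm{KSD}(Q \,\|\, P)
\;=\;
\sup_{\|f\|_{\mathcal{H}_k} \le 1}
\bigl|\, \mathbb{E}_{X \sim Q}\!\left[ \mathcal{A}_P f(X) \right] \bigr|,
\qquad
\mathcal{A}_P f(x) \;=\; \frac{1}{p(x)}\, \nabla \cdot \bigl( p(x)\, m(x)\, f(x) \bigr),
\]

which requires only the score of p and hence no normalizing constant. The convergence control statement compares this quantity with the integral probability metric generated by pseudo-Lipschitz functions of order r,

\[
d_{\mathrm{PL}_r}(Q, P)
\;=\;
\sup_{h \in \mathrm{PL}_r}
\bigl|\, \mathbb{E}_{Q}[h] - \mathbb{E}_{P}[h] \,\bigr|,
\qquad
\mathrm{PL}_r
=
\bigl\{ h : |h(x) - h(y)| \le L\,(1 + \|x\|^{r} + \|y\|^{r})\,\|x - y\| \ \text{for all } x, y \bigr\},
\]

so that, unlike the bounded-Lipschitz case, test functions of polynomial growth (and hence moments up to order r + 1) are covered.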