Kernel Stein discrepancy (KSD) is a widely used kernel-based measure of discrepancy between probability measures. It is often employed when a user has a collection of samples from a candidate probability measure and wishes to compare them against a specified target probability measure. A useful property of KSD is that it may be calculated using samples from only the candidate measure and without knowledge of the normalising constant of the target measure. KSD has been employed in a range of settings, including goodness-of-fit testing, parametric inference, MCMC output assessment and generative modelling. Two main issues with current KSD methodology are (i) its lack of applicability beyond the finite-dimensional Euclidean setting and (ii) a lack of clarity about what influences KSD performance. This paper provides a novel spectral representation of KSD which remedies both issues: it makes KSD applicable to Hilbert-valued data and reveals the impact of the choice of kernel and Stein operator on KSD performance. We demonstrate the efficacy of the proposed methodology by performing goodness-of-fit tests for various Gaussian and non-Gaussian functional models in a number of synthetic data experiments.
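For context, the following is a minimal sketch of the standard finite-dimensional construction on $\mathbb{R}^d$ (not the Hilbert-space extension developed in the paper); the symbols $Q$, $P$, $p$, $k$ and $s_p$ are introduced here for illustration. With the Langevin Stein operator and a reproducing kernel $k$, the squared KSD of a candidate measure $Q$ against a target $P$ with (possibly unnormalised) density $p$ is typically written as
\[
\mathrm{KSD}^2(Q \,\|\, P) \;=\; \mathbb{E}_{X, X' \sim Q}\!\left[ k_p(X, X') \right],
\]
where the Stein kernel is
\[
k_p(x, x') \;=\; s_p(x)^\top s_p(x')\, k(x, x') \;+\; s_p(x)^\top \nabla_{x'} k(x, x') \;+\; s_p(x')^\top \nabla_{x} k(x, x') \;+\; \nabla_x \cdot \nabla_{x'} k(x, x').
\]
The target enters only through the score $s_p(x) = \nabla_x \log p(x)$, which is unchanged when $p$ is rescaled by a constant; this is why the normalising constant is not required. Given samples $x_1, \dots, x_n \sim Q$, a V-statistic estimate is $\frac{1}{n^2} \sum_{i,j=1}^{n} k_p(x_i, x_j)$.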