Kernel Stein discrepancy (KSD) is a widely used kernel-based non-parametric measure of discrepancy between probability measures. It is often employed when a user has a collection of samples from a candidate probability measure and wishes to compare them against a specified target probability measure. A useful property of KSD is that it can be computed from samples of the candidate measure alone, without knowledge of the normalising constant of the target measure. KSD has been employed in a range of settings, including goodness-of-fit testing, parametric inference, MCMC output assessment and generative modelling. Two main issues with current KSD methodology are (i) the lack of applicability beyond the finite-dimensional Euclidean setting and (ii) a lack of clarity about what influences KSD performance. This paper provides a novel spectral representation of KSD which remedies both of these issues, making KSD applicable to Hilbert-valued data and revealing the impact of kernel and Stein operator choice on the KSD. We demonstrate the efficacy of the proposed methodology by performing goodness-of-fit tests for various Gaussian and non-Gaussian functional models in a number of synthetic data experiments.
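For context, a minimal sketch of the standard finite-dimensional KSD with the Langevin Stein operator follows (this is the classical Euclidean setting referred to above; the notation is illustrative and not taken from this paper). Writing $s_p(x) = \nabla_x \log p(x)$ for the score of the target density $p$, which is unchanged if $p$ is known only up to a normalising constant, the Stein kernel induced by a reproducing kernel $k$ is
\[
k_p(x,y) \;=\; s_p(x)^\top s_p(y)\,k(x,y) \;+\; s_p(x)^\top \nabla_y k(x,y) \;+\; s_p(y)^\top \nabla_x k(x,y) \;+\; \nabla_x \cdot \nabla_y k(x,y),
\]
and the squared KSD between a candidate measure $Q$ and the target $P$ is
\[
\mathrm{KSD}^2(Q \,\|\, P) \;=\; \mathbb{E}_{X,X' \sim Q}\big[k_p(X,X')\big] \;\approx\; \frac{1}{n^2}\sum_{i=1}^{n}\sum_{j=1}^{n} k_p(x_i,x_j),
\]
which can be estimated from samples $x_1,\dots,x_n$ drawn from $Q$ alone, requiring neither samples from $P$ nor its normalising constant.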