Distributional comparison is a fundamental problem in statistical data analysis, with numerous applications across the sciences and engineering. Many methods exist for distributional comparison, but the Kernel Stein Discrepancy (KSD) has gained significant popularity in recent years. In this paper, we present a novel, mathematically rigorous, and consistent generalization of the KSD to Riemannian manifolds. We first generalize the Stein operator to Riemannian manifolds and use it to establish Stein's Lemma in this setting. We then define a novel Stein class and use it to develop what we call qualified kernels, which in turn yield a closed-form KSD on Riemannian manifolds. We present several examples of our theory applied to Riemannian manifolds commonly encountered in applications, namely Riemannian homogeneous spaces such as the n-sphere, the Grassmannian, the Stiefel manifold, and the manifold of symmetric positive definite matrices, among others. On these manifolds, we consider a variety of distributions with intractable normalization constants and derive closed-form expressions for the KSD and the minimum KSD estimator (mKSDE). We establish several theoretical properties of the mKSDE and present a comparison between the mKSDE and the MLE on a widely used example, the sphere.
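As background for readers unfamiliar with the Euclidean construction that this work generalizes, a minimal sketch of the standard (Euclidean) KSD follows; here $k$ is an assumed reproducing kernel and $s_p(x) = \nabla_x \log p(x)$ is the score of the target density $p$, which can be evaluated without the normalizing constant. This is background only, not the manifold-valued formula derived in the paper:
$$ \mathrm{KSD}^2(q \,\|\, p) = \mathbb{E}_{x, x' \sim q}\big[ u_p(x, x') \big], $$
$$ u_p(x, x') = s_p(x)^\top k(x, x')\, s_p(x') + s_p(x)^\top \nabla_{x'} k(x, x') + \nabla_x k(x, x')^\top s_p(x') + \operatorname{tr}\big( \nabla_x \nabla_{x'} k(x, x') \big). $$
In this standard setting, the minimum KSD estimator selects the parameter $\theta$ minimizing $\mathrm{KSD}^2(q \,\|\, p_\theta)$, with $q$ replaced by the empirical distribution of the observed sample.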