Distributional approximation is a fundamental problem in machine learning, with numerous applications across science and engineering. The key challenge in most approximation methods is the intractable normalization constant of the parametrized distributions used to model the data. In this paper, we present a novel Stein operator on Lie groups, leading to a kernel Stein discrepancy (KSD) that serves as a normalization-free loss function. We present several theoretical results characterizing the properties of this new KSD on Lie groups and of its minimizer, namely the minimum KSD estimator (MKSDE). We prove several properties of the MKSDE, including strong consistency, a central limit theorem, and a closed form of the MKSDE for the von Mises-Fisher distribution on SO(N). Finally, we present experimental evidence of the advantages of minimizing the KSD over maximum likelihood estimation.
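To ground the claim that the KSD is normalization-free, the following is a sketch of the standard Euclidean KSD (Liu et al., 2016; Chwialkowski et al., 2016); the paper's Lie-group Stein operator replaces the Euclidean score and derivatives with their group-theoretic analogues, and the notation below ($p_\theta$, $k$, $s_\theta$) is illustrative, not the paper's. For an unnormalized model density $\tilde{p}_\theta$ with score $s_\theta(x) = \nabla_x \log \tilde{p}_\theta(x)$ and a positive-definite kernel $k$,
\[
\mathrm{KSD}^2(q, p_\theta) = \mathbb{E}_{x, x' \sim q}\left[\kappa_\theta(x, x')\right],
\]
where
\[
\kappa_\theta(x, x') = s_\theta(x)^{\top} k(x, x')\, s_\theta(x') + s_\theta(x)^{\top} \nabla_{x'} k(x, x') + \nabla_{x} k(x, x')^{\top} s_\theta(x') + \operatorname{tr}\!\left(\nabla_{x} \nabla_{x'} k(x, x')\right).
\]
Because the model enters only through $\nabla \log \tilde{p}_\theta$, the intractable normalization constant cancels from the loss; the MKSDE is then obtained by minimizing over $\theta$ an empirical (U- or V-statistic) estimate of $\mathrm{KSD}^2$ computed from the samples.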