We present a simple method to approximate Rao's distance between multivariate normal distributions, based on discretizing curves joining normal distributions and approximating the Rao distances between successive nearby normal distributions on the curves by the square root of the Jeffreys divergence, the symmetrized Kullback-Leibler divergence. We experimentally consider the linear interpolation curves in the ordinary, natural, and expectation parameterizations of the normal distributions, and compare these curves with a curve derived from Calvo and Oller's isometric embedding of the Fisher-Rao $d$-variate normal manifold into the cone of $(d+1)\times (d+1)$ symmetric positive-definite matrices [Journal of Multivariate Analysis 35.2 (1990): 223-242]. We report on our experiments and assess the quality of our approximation technique by comparing the numerical approximations with both lower and upper bounds on Rao's distance. Finally, we present several information-geometric properties of Calvo and Oller's isometric embedding.
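To make the scheme concrete, the following is a minimal sketch, not the authors' reference implementation, of the curve-discretization approximation for one of the curves mentioned above: the linear interpolation in the ordinary parameterization $(\mu, \Sigma)$. The function names and the step count T are hypothetical illustration choices; the sketch relies only on the closed-form Kullback-Leibler divergence between multivariate normals.

import numpy as np

def kl_mvn(mu0, S0, mu1, S1):
    """Kullback-Leibler divergence KL(N(mu0,S0) : N(mu1,S1))."""
    d = len(mu0)
    S1_inv = np.linalg.inv(S1)
    dmu = mu1 - mu0
    _, logdet0 = np.linalg.slogdet(S0)
    _, logdet1 = np.linalg.slogdet(S1)
    return 0.5 * (np.trace(S1_inv @ S0) + dmu @ S1_inv @ dmu - d
                  + logdet1 - logdet0)

def jeffreys_mvn(mu0, S0, mu1, S1):
    """Jeffreys divergence: the symmetrized Kullback-Leibler divergence."""
    return kl_mvn(mu0, S0, mu1, S1) + kl_mvn(mu1, S1, mu0, S0)

def rao_distance_approx(mu0, S0, mu1, S1, T=1000):
    """Approximate Rao's distance by discretizing the linear interpolation
    curve in the ordinary parameterization into T segments and summing the
    square roots of the Jeffreys divergences between successive normals."""
    total = 0.0
    for i in range(T):
        t0, t1 = i / T, (i + 1) / T
        # Convex combinations of SPD matrices remain SPD.
        mua, Sa = (1 - t0) * mu0 + t0 * mu1, (1 - t0) * S0 + t0 * S1
        mub, Sb = (1 - t1) * mu0 + t1 * mu1, (1 - t1) * S0 + t1 * S1
        total += np.sqrt(jeffreys_mvn(mua, Sa, mub, Sb))
    return total

# Example usage on two bivariate normals:
mu0, S0 = np.zeros(2), np.eye(2)
mu1, S1 = np.array([1.0, 2.0]), np.array([[2.0, 0.5], [0.5, 1.0]])
print(rao_distance_approx(mu0, S0, mu1, S1))

The same driver applies to the other curves considered in the paper by swapping in linear interpolation in the natural or expectation parameters, or the curve induced by Calvo and Oller's embedding, in place of the ordinary-parameter interpolation.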