The Chernoff information between two probability measures is a statistical divergence that measures their distinguishability and is defined as their maximally skewed Bhattacharyya distance. Although the Chernoff information was originally introduced for bounding the Bayes error in statistical hypothesis testing, the divergence has since found many other applications, ranging from information fusion to quantum information theory. From the viewpoint of information theory, the Chernoff information can also be interpreted as a minimax symmetrization of the Kullback--Leibler divergence, related to the capacity of a discrete memoryless channel. In this paper, we first revisit the Chernoff information by considering the exponential families induced by geometric mixtures: the likelihood ratio exponential families. Second, we show how to (i) calculate, using symbolic computing, a closed-form formula for the Chernoff information between any two univariate Gaussian distributions and (ii) use a fast numerical scheme to approximate the Chernoff information between any two multivariate Gaussian distributions. We also report a closed-form formula for the Chernoff information between centered Gaussians with scaled covariance matrices.
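To make the definition concrete: the Chernoff information is $D_C(p,q)=\max_{\alpha\in(0,1)}\,-\log\int p^{\alpha}(x)\,q^{1-\alpha}(x)\,d\mu(x)$, i.e., the maximally skewed Bhattacharyya distance. The sketch below is a minimal, generic numerical illustration of this definition for two univariate Gaussians (it is not the paper's symbolic closed-form formula nor its fast multivariate scheme); it only assumes standard SciPy routines (`scipy.stats.norm`, `scipy.integrate.quad`, `scipy.optimize.minimize_scalar`), and the function names are illustrative.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar
from scipy.stats import norm


def skewed_bhattacharyya(p, q, alpha):
    # D_{B,alpha}(p, q) = -log \int p(x)^alpha q(x)^(1 - alpha) dx
    integrand = lambda x: p.pdf(x) ** alpha * q.pdf(x) ** (1.0 - alpha)
    coeff, _ = quad(integrand, -np.inf, np.inf)
    return -np.log(coeff)


def chernoff_information(p, q):
    # Chernoff information: maximize the skewed Bhattacharyya distance over alpha in (0, 1).
    # The objective is concave in alpha, so a bounded scalar search suffices.
    res = minimize_scalar(lambda a: -skewed_bhattacharyya(p, q, a),
                          bounds=(1e-6, 1.0 - 1e-6), method="bounded")
    return -res.fun, res.x  # (Chernoff information, optimal skew alpha*)


if __name__ == "__main__":
    p = norm(loc=0.0, scale=1.0)   # N(0, 1)
    q = norm(loc=2.0, scale=3.0)   # N(2, 9)
    value, alpha_star = chernoff_information(p, q)
    print(f"Chernoff information ~ {value:.6f} at alpha* ~ {alpha_star:.4f}")
```

The optimal skew $\alpha^{*}$ returned by the search is the value at which the two sided Kullback--Leibler divergences to the geometric mixture coincide, which is why the Chernoff information can be read as a minimax symmetrization of the Kullback--Leibler divergence.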