The Chernoff information between two probability measures is a statistical divergence measuring their deviation, defined as their maximally skewed Bhattacharyya distance. Although the Chernoff information was originally introduced for bounding the Bayes error in statistical hypothesis testing, the divergence has found many other applications due to its empirical robustness property, ranging from information fusion to quantum information. From the viewpoint of information theory, the Chernoff information can also be interpreted as a minmax symmetrization of the Kullback--Leibler divergence. In this paper, we first revisit the Chernoff information between two densities of a measurable Lebesgue space by considering the exponential families induced by their geometric mixtures: the so-called likelihood ratio exponential families. Second, we show how to (i) solve exactly the Chernoff information between any two univariate Gaussian distributions or get a closed-form formula using symbolic computing, (ii) report a closed-form formula of the Chernoff information between centered Gaussians with scaled covariance matrices, and (iii) use a fast numerical scheme to approximate the Chernoff information between any two multivariate Gaussian distributions.
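To make the objective in (iii) concrete, the following is a minimal sketch and not the paper's dedicated numerical scheme: assuming NumPy/SciPy, it evaluates the standard closed form of the alpha-skewed Bhattacharyya distance between two multivariate Gaussians and approximates the Chernoff information by a generic bounded one-dimensional maximization over the skewing parameter alpha, exploiting the concavity of the skewed Bhattacharyya distance in alpha. The function names skewed_bhattacharyya and chernoff_information are illustrative, not from the paper.

# Sketch: Chernoff information between two multivariate Gaussians, obtained by
# maximizing the alpha-skewed Bhattacharyya distance D_alpha over alpha in (0, 1)
# with a generic bounded scalar optimizer (not the paper's numerical scheme).
import numpy as np
from scipy.optimize import minimize_scalar

def skewed_bhattacharyya(mu1, Sigma1, mu2, Sigma2, alpha):
    """D_alpha(p1 : p2) = -log \int p1(x)^alpha p2(x)^(1-alpha) dx for Gaussians.

    Closed form: with S_alpha = (1 - alpha) Sigma1 + alpha Sigma2,
      D_alpha = (alpha (1 - alpha) / 2) (mu1 - mu2)^T S_alpha^{-1} (mu1 - mu2)
                + (1/2) [log det S_alpha
                         - (1 - alpha) log det Sigma1 - alpha log det Sigma2].
    """
    S_alpha = (1.0 - alpha) * Sigma1 + alpha * Sigma2
    dmu = mu1 - mu2
    quad = 0.5 * alpha * (1.0 - alpha) * dmu @ np.linalg.solve(S_alpha, dmu)
    logdet = 0.5 * (np.linalg.slogdet(S_alpha)[1]
                    - (1.0 - alpha) * np.linalg.slogdet(Sigma1)[1]
                    - alpha * np.linalg.slogdet(Sigma2)[1])
    return quad + logdet

def chernoff_information(mu1, Sigma1, mu2, Sigma2):
    """Maximize D_alpha over alpha in (0, 1).

    D_alpha is concave in alpha (the Chernoff coefficient is log-convex by
    Hoelder's inequality), so minimizing -D_alpha with a bounded scalar
    optimizer recovers the optimal skewing parameter alpha*.
    """
    res = minimize_scalar(
        lambda a: -skewed_bhattacharyya(mu1, Sigma1, mu2, Sigma2, a),
        bounds=(1e-9, 1.0 - 1e-9), method="bounded")
    return -res.fun, res.x  # (Chernoff information, optimal alpha*)

if __name__ == "__main__":
    # Illustrative example with arbitrary parameters.
    mu1, S1 = np.array([0.0, 0.0]), np.eye(2)
    mu2, S2 = np.array([1.0, 0.0]), np.diag([2.0, 0.5])
    C, alpha_star = chernoff_information(mu1, S1, mu2, S2)
    print(f"Chernoff information ~ {C:.6f} at alpha* ~ {alpha_star:.4f}")

For univariate Gaussians the same maximization can be carried out symbolically, which is the exact/closed-form route mentioned in (i); the sketch above only illustrates the generic numerical approximation of (iii).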