The Jensen-Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback-Leibler divergence; it measures the total Kullback-Leibler divergence to the average mixture distribution. However, the Jensen-Shannon divergence between Gaussian distributions is not available in closed form. To bypass this problem, we present a generalization of the Jensen-Shannon (JS) divergence using abstract means which yields closed-form expressions when the mean is chosen according to the parametric family of distributions. More generally, we define the JS-symmetrizations of any distance using generalized statistical mixtures derived from abstract means. In particular, we first show that the geometric mean is well-suited for exponential families, and report two closed-form formulas for (i) the geometric Jensen-Shannon divergence between probability densities of the same exponential family, and (ii) the geometric JS-symmetrization of the reverse Kullback-Leibler divergence. As a second illustrating example, we show that the harmonic mean is well-suited for the scale Cauchy distributions, and report a closed-form formula for the harmonic Jensen-Shannon divergence between scale Cauchy distributions. We also define generalized Jensen-Shannon divergences between matrices (e.g., quantum Jensen-Shannon divergences) and consider clustering with respect to these novel Jensen-Shannon divergences.
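To make the opening definition concrete, here is a minimal numerical sketch (not the paper's closed-form results) of the standard Jensen-Shannon divergence for discrete distributions, computed directly as the average Kullback-Leibler divergence to the arithmetic mid-mixture; the function names `kl` and `js` are illustrative choices, not notation from the paper.

```python
import numpy as np

def kl(p, q):
    # Kullback-Leibler divergence KL(p || q) between discrete distributions,
    # with the convention 0 * log(0/q) = 0.
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js(p, q):
    # Jensen-Shannon divergence: total KL divergence to the
    # average mixture m = (p + q) / 2, as in the definition above.
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = np.array([0.5, 0.5])
q = np.array([0.9, 0.1])
print(js(p, q))  # symmetric, and bounded above by log(2)
```

Unlike the KL divergence, this quantity is symmetric and bounded by log(2) (in nats); the paper's generalization replaces the arithmetic mean `m` by an abstract mean (e.g., geometric or harmonic) matched to the parametric family.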