We generalize the Jensen-Shannon divergence by considering a variational definition with respect to a generic mean, thereby extending the notion of Sibson's information radius. The variational definition applies to an arbitrary distance and yields another way to define a Jensen-Shannon symmetrization of distances. When the variational optimization is further constrained so that the optimized measure belongs to a prescribed family of probability measures, we obtain relative Jensen-Shannon divergences and symmetrizations, which generalize the concept of information projections. Finally, we discuss applications of these variational Jensen-Shannon divergences and diversity indices to the clustering and quantization of probability measures, including statistical mixtures.
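For concreteness, a minimal sketch of the variational identity the abstract alludes to, in illustrative notation that may differ from the body of the paper: the classical Jensen-Shannon divergence between densities $p$ and $q$ admits the variational form
\[
\mathrm{JS}(p,q) \;=\; \min_{c} \frac{1}{2}\Big( \mathrm{KL}(p:c) + \mathrm{KL}(q:c) \Big),
\]
with the minimum attained at the mixture $c^{*} = \frac{p+q}{2}$; this is Sibson's information radius for two points. The generalization described above replaces the Kullback-Leibler divergence by an arbitrary distance $D$ and the arithmetic average by a generic mean $M$ (symbols $D$ and $M$ are assumptions for illustration):
\[
\mathrm{JS}^{M}_{D}(p,q) \;=\; \min_{c} M\big( D(p:c),\, D(q:c) \big),
\]
and the relative variants restrict the minimization to $c$ ranging over a prescribed family of probability measures, in the spirit of information projections.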