We consider the space of $w$-mixtures, defined as the set of finite statistical mixtures that share the same prescribed component distributions; this set is closed under convex combinations. The information geometry induced by the Bregman generator set to the Shannon negentropy on this space yields a dually flat space called the mixture family manifold. We show how the Kullback-Leibler (KL) divergence can be recovered from the corresponding Bregman divergence for the negentropy generator: that is, the KL divergence between two $w$-mixtures amounts to a Bregman Divergence (BD) induced by the Shannon negentropy generator. Thus the KL divergence between two Gaussian Mixture Models (GMMs) sharing the same Gaussian components is equivalent to a Bregman divergence. This KL-BD equivalence on a mixture family manifold implies that we can perform optimal KL-averaging aggregation of $w$-mixtures without information loss. More generally, we prove that the statistical skew Jensen-Shannon divergence between $w$-mixtures is equivalent to a skew Jensen divergence between their corresponding parameters. Finally, we state several properties, divergence identities, and inequalities relating to $w$-mixtures.
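To make the stated equivalences concrete, the following sketch spells out the two identities in one common notation (the parameterization $m_\theta$, the generator $F$, and the divergences $B_F$ and $J_{F,\alpha}$ are notational assumptions introduced here, not quoted from the text). Fix linearly independent component distributions $p_0, p_1, \ldots, p_D$ and parameterize a $w$-mixture by $\theta = (\theta_1, \ldots, \theta_D)$ via $m_\theta(x) = \sum_{i=1}^{D} \theta_i p_i(x) + (1 - \sum_{i=1}^{D} \theta_i) p_0(x)$. With the Bregman generator set to the Shannon negentropy $F(\theta) = -h(m_\theta) = \int m_\theta(x) \log m_\theta(x) \, \mathrm{d}x$, which is strictly convex in $\theta$, the KL-BD equivalence reads

$$\mathrm{KL}(m_\theta : m_{\theta'}) = B_F(\theta : \theta') = F(\theta) - F(\theta') - (\theta - \theta')^\top \nabla F(\theta'),$$

and, writing $\mathrm{JS}_\alpha(p : q) = \alpha \mathrm{KL}(p : \alpha p + (1-\alpha) q) + (1-\alpha) \mathrm{KL}(q : \alpha p + (1-\alpha) q)$ for the skew Jensen-Shannon divergence with skew parameter $\alpha \in (0, 1)$, the second identity reads

$$\mathrm{JS}_\alpha(m_\theta : m_{\theta'}) = J_{F,\alpha}(\theta : \theta') = \alpha F(\theta) + (1 - \alpha) F(\theta') - F(\alpha \theta + (1 - \alpha) \theta').$$

The second identity follows from the first because the $\alpha$-mixture of two $w$-mixtures is again a $w$-mixture with linearly interpolated parameter $\alpha \theta + (1 - \alpha) \theta'$, so the linear terms of the two Bregman divergences cancel.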
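As a quick numerical illustration of the KL-BD equivalence for GMMs sharing the same Gaussian components, here is a minimal sanity-check sketch, assuming NumPy/SciPy; the grid bounds, component parameters, and function names are illustrative choices, not from the text:

    import numpy as np
    from scipy.stats import norm

    # Fixed Gaussian components shared by both w-mixtures (illustrative choice).
    x = np.linspace(-15.0, 15.0, 300001)
    dx = x[1] - x[0]
    p0 = norm.pdf(x, loc=-2.0, scale=1.0)
    p1 = norm.pdf(x, loc=3.0, scale=1.5)

    def mix(theta):
        # w-mixture density m_theta = (1 - theta) p0 + theta p1.
        return (1.0 - theta) * p0 + theta * p1

    def F(theta):
        # Shannon negentropy generator F(theta) = -h(m_theta), on the grid.
        m = mix(theta)
        return np.sum(m * np.log(m)) * dx

    def kl(ta, tb):
        # KL(m_ta : m_tb) by direct numerical integration.
        ma, mb = mix(ta), mix(tb)
        return np.sum(ma * np.log(ma / mb)) * dx

    def bregman(ta, tb, eps=1e-6):
        # B_F(ta : tb) with a central finite-difference gradient of F.
        grad = (F(tb + eps) - F(tb - eps)) / (2.0 * eps)
        return F(ta) - F(tb) - (ta - tb) * grad

    print(kl(0.3, 0.7), bregman(0.3, 0.7))  # the two values should match closely

On this setup the directly integrated KL and the finite-difference Bregman divergence agree up to grid and differencing error, which is the content of the KL-BD equivalence restricted to a single weight parameter.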