In many contexts Gaussian Mixtures (GMs) are used to approximate probability distributions, possibly time-varying ones. In some applications the number of GM components grows exponentially over time, and reduction procedures are required to keep it reasonably small. The GM reduction (GMR) problem can be formulated by choosing different measures of the dissimilarity between the GMs before and after reduction, such as the Kullback-Leibler Divergence (KLD) and the Integral Squared Error (ISE). Since in no case can the solution be obtained in closed form, many approximate GMR algorithms have been proposed over the past three decades, although none of them provides optimality guarantees. In this work we discuss the importance of the choice of the dissimilarity measure and the issue of the consistency of all steps of a reduction algorithm with the chosen measure. Indeed, most existing GMR algorithms are composed of several steps that are not consistent with a single measure, and for this reason they may produce reduced GMs far from optimality. In particular, the use of the KLD, the ISE, and the normalized ISE is discussed and compared from this perspective.
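A point worth noting is that, unlike the KLD, which admits no closed form between two GMs, the ISE between two Gaussian mixtures can be evaluated exactly, because the integral of a product of two Gaussian densities is itself a Gaussian evaluation. The sketch below illustrates this for scalar mixtures; it is a minimal illustration, the function names (`ise`, `nise`) are hypothetical, and the normalization shown (dividing the ISE by the sum of the two self-energies) is one common choice from the literature, not necessarily the exact definition adopted in this work.

```python
import numpy as np
from scipy.stats import norm

def _cross_energy(w1, m1, v1, w2, m2, v2):
    """Sum_{i,j} w1_i * w2_j * N(m1_i; m2_j, v1_i + v2_j).

    Uses the identity  integral N(x; a, p) N(x; b, q) dx = N(a; b, p + q),
    which makes the ISE between Gaussian mixtures available in closed form.
    """
    total = 0.0
    for wi, mi, vi in zip(w1, m1, v1):
        for wj, mj, vj in zip(w2, m2, v2):
            total += wi * wj * norm.pdf(mi, loc=mj, scale=np.sqrt(vi + vj))
    return total

def ise(w1, m1, v1, w2, m2, v2):
    """Closed-form ISE between two scalar Gaussian mixtures:
    J_ff - 2*J_fg + J_gg, with J the cross-energy above."""
    return (_cross_energy(w1, m1, v1, w1, m1, v1)
            - 2.0 * _cross_energy(w1, m1, v1, w2, m2, v2)
            + _cross_energy(w2, m2, v2, w2, m2, v2))

def nise(w1, m1, v1, w2, m2, v2):
    """One common normalized ISE, lying in [0, 1]:
    (J_ff - 2*J_fg + J_gg) / (J_ff + J_gg)."""
    jff = _cross_energy(w1, m1, v1, w1, m1, v1)
    jgg = _cross_energy(w2, m2, v2, w2, m2, v2)
    jfg = _cross_energy(w1, m1, v1, w2, m2, v2)
    return (jff - 2.0 * jfg + jgg) / (jff + jgg)

# Example: a 3-component mixture f vs a hypothetical 2-component reduction g
w_f, m_f, v_f = [0.3, 0.3, 0.4], [-1.0, -0.8, 2.0], [0.5, 0.6, 0.4]
w_g, m_g, v_g = [0.6, 0.4], [-0.9, 2.0], [0.56, 0.4]
print(ise(w_f, m_f, v_f, w_g, m_g, v_g))
print(nise(w_f, m_f, v_f, w_g, m_g, v_g))
```

Because the normalized ISE is bounded, it can be read as a scale-free dissimilarity score, whereas the raw ISE depends on the overall energy of the mixtures being compared.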