The Jeffreys divergence is a renowned symmetrization of the statistical Kullback-Leibler divergence, which is often used in statistics, machine learning, signal processing, and the information sciences in general. Since the Jeffreys divergence between the ubiquitous Gaussian Mixture Models (GMMs) is not available in closed form, many techniques with various pros and cons have been proposed in the literature to either (i) estimate, (ii) approximate, or (iii) lower and/or upper bound this divergence. In this work, we propose a simple yet fast heuristic to approximate the Jeffreys divergence between two univariate GMMs with arbitrary numbers of components. The heuristic relies on converting the GMMs into a pair of dually parameterized probability densities belonging to an exponential family. In particular, we consider Exponential-Polynomial Densities (EPDs) and design a goodness-of-fit criterion, a generalization of the Hyv\"arinen divergence, to measure the dissimilarity between a GMM and an EPD. This criterion allows one to select the orders of the EPDs that approximate the GMMs. We demonstrate experimentally that the computational time of our heuristic improves over the stochastic Monte Carlo estimation baseline by several orders of magnitude while approximating the Jeffreys divergence reasonably well, especially when the univariate mixtures have a small number of modes.
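For concreteness, the stochastic Monte Carlo estimation baseline mentioned above can be sketched as follows for univariate GMMs, using the identity J(p,q) = KL(p:q) + KL(q:p) and sample averages of log-density ratios. This is a minimal illustrative sketch of the baseline, not the paper's heuristic; the function names, parameterization (weights, means, standard deviations), and sample size are assumptions made for the example.

import numpy as np

def gmm_pdf(x, weights, means, stds):
    # Density of a univariate Gaussian mixture evaluated at the points x.
    x = np.asarray(x)[:, None]
    comps = np.exp(-0.5 * ((x - means) / stds) ** 2) / (stds * np.sqrt(2.0 * np.pi))
    return comps @ weights

def gmm_sample(n, weights, means, stds, rng):
    # Draw n i.i.d. samples by first picking a component, then sampling it.
    ks = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[ks], stds[ks])

def jeffreys_mc(p, q, n=100_000, seed=0):
    # Monte Carlo estimate of J(p, q) = KL(p:q) + KL(q:p) with n samples per mixture.
    rng = np.random.default_rng(seed)
    xp = gmm_sample(n, *p, rng)   # samples from p, used for KL(p:q)
    xq = gmm_sample(n, *q, rng)   # samples from q, used for KL(q:p)
    kl_pq = np.mean(np.log(gmm_pdf(xp, *p) / gmm_pdf(xp, *q)))
    kl_qp = np.mean(np.log(gmm_pdf(xq, *q) / gmm_pdf(xq, *p)))
    return kl_pq + kl_qp

# Hypothetical example: two 2-component univariate GMMs (weights, means, stds).
p = (np.array([0.6, 0.4]), np.array([-1.0, 2.0]), np.array([0.5, 1.0]))
q = (np.array([0.5, 0.5]), np.array([0.0, 3.0]), np.array([1.0, 0.7]))
print(jeffreys_mc(p, q))

The estimator is consistent but its cost grows with the number of samples and mixture components, which is the computational overhead the proposed EPD-based heuristic aims to avoid.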