The Jeffreys divergence is a renowned symmetrization of the statistical Kullback-Leibler divergence which is often used in machine learning, signal processing, and information sciences. Since the Jeffreys divergence between the ubiquitous Gaussian Mixture Models (GMMs) is not available in closed form, many techniques with various pros and cons have been proposed in the literature to either (i) estimate, (ii) approximate, or (iii) lower and upper bound this divergence. In this work, we propose a simple yet fast heuristic to approximate the Jeffreys divergence between two GMMs with an arbitrary number of components. The heuristic relies on converting the GMMs into pairs of dually parameterized probability densities belonging to exponential families. In particular, we consider Polynomial Exponential Densities (PEDs), and design a goodness-of-fit criterion, which is a generalization of the Hyv\"arinen divergence, to measure the dissimilarity between a GMM and a PED. This criterion allows one to select the orders of the PEDs that approximate the GMMs. We demonstrate experimentally that the computational time of our heuristic improves over the stochastic Monte Carlo estimation baseline by several orders of magnitude, while approximating the Jeffreys divergence reasonably well, especially when the univariate mixtures have a small number of modes.
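For reference, a minimal sketch of the stochastic Monte Carlo estimation baseline mentioned above, assuming a simple (weights, means, stds) parameterization of univariate GMMs; the function and variable names are illustrative assumptions and do not correspond to the paper's heuristic or notation.

```python
import numpy as np

def gmm_pdf(x, weights, means, stds):
    """Density of a univariate Gaussian mixture evaluated at points x."""
    x = np.asarray(x)[:, None]
    comp = np.exp(-0.5 * ((x - means) / stds) ** 2) / (stds * np.sqrt(2.0 * np.pi))
    return comp @ weights

def gmm_sample(n, weights, means, stds, rng):
    """Draw n i.i.d. samples from the mixture."""
    idx = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[idx], stds[idx])

def jeffreys_mc(p, q, n=100_000, seed=0):
    """Monte Carlo estimate of J(p, q) = KL(p || q) + KL(q || p)."""
    rng = np.random.default_rng(seed)
    xp = gmm_sample(n, *p, rng)  # samples from p for KL(p || q)
    xq = gmm_sample(n, *q, rng)  # samples from q for KL(q || p)
    kl_pq = np.mean(np.log(gmm_pdf(xp, *p) / gmm_pdf(xp, *q)))
    kl_qp = np.mean(np.log(gmm_pdf(xq, *q) / gmm_pdf(xq, *p)))
    return kl_pq + kl_qp

# Example: Jeffreys divergence between two 2-component univariate GMMs.
p = (np.array([0.6, 0.4]), np.array([-1.0, 2.0]), np.array([0.5, 1.0]))
q = (np.array([0.5, 0.5]), np.array([0.0, 3.0]), np.array([1.0, 0.7]))
print(jeffreys_mc(p, q))
```

Such an estimator converges slowly with the sample size n, which is what motivates the closed-form-based heuristic described in the abstract.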