Many applications, including machine learning, benefit from computing the exact divergence between two discrete probability measures. Unfortunately, in the absence of any assumptions about the structure of or independencies within these distributions, computing the divergence between them is intractable in high dimensions. We show that a wide family of functionals and divergences, such as the alpha-beta divergence, can be computed exactly between two decomposable models, i.e., chordal Markov networks, in time exponential in the treewidth of these models. The alpha-beta divergence is a family of divergences that includes popular divergences such as the Kullback-Leibler divergence, the Hellinger distance, and the chi-squared divergence. Thus, we can compute the exact value of any divergence in this broad class to the extent that the two distributions can be accurately modeled by decomposable models.
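For reference, one common parameterization of the alpha-beta divergence from the literature is sketched below; the exact convention used here may differ by constant factors. For two discrete distributions $P$ and $Q$ over the same finite space and parameters $\alpha,\beta$ with $\alpha,\beta,\alpha+\beta \neq 0$,

$$
D_{AB}^{(\alpha,\beta)}(P \,\|\, Q) \;=\; -\frac{1}{\alpha\beta}\sum_{x}\Big( p(x)^{\alpha} q(x)^{\beta} \;-\; \tfrac{\alpha}{\alpha+\beta}\, p(x)^{\alpha+\beta} \;-\; \tfrac{\beta}{\alpha+\beta}\, q(x)^{\alpha+\beta} \Big),
$$

with the excluded parameter values defined by taking limits. Under this parameterization, the limit $(\alpha,\beta)\to(1,0)$ recovers the Kullback-Leibler divergence, $\alpha=\beta=\tfrac{1}{2}$ yields the squared Hellinger distance up to a constant factor, and $(\alpha,\beta)=(2,-1)$ yields the Pearson chi-squared divergence up to a constant factor.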