This paper provides a unified perspective on the Kullback-Leibler (KL) divergence and the integral probability metrics (IPMs) through the lens of maximum-likelihood density-ratio estimation (DRE). Both the KL-divergence and the IPMs are widely used across fields, in applications such as generative modeling; however, a unified understanding of the two has remained unexplored. In this paper, we show that the KL-divergence and the IPMs can both be represented as maximized likelihoods that differ only in their sampling schemes, and we use this result to derive a unified form of the IPMs and a relaxed estimation method. To formulate the estimation problem, we construct an unconstrained maximum-likelihood estimator that performs DRE under a stratified sampling scheme. We further propose a novel class of probability divergences, called the Density Ratio Metrics (DRMs), that interpolates between the KL-divergence and the IPMs. In addition to these findings, we introduce applications of the DRMs, such as DRE and generative adversarial networks. In experiments, we validate the effectiveness of our proposed methods.
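For reference, the two objects being unified have the standard textbook forms (recalled here under the usual conventions, not quoted from the paper body): for distributions $P$ and $Q$ with densities $p$ and $q$, and a function class $\mathcal{F}$,
\[
\mathrm{KL}(P \,\|\, Q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx,
\qquad
\mathrm{IPM}_{\mathcal{F}}(P, Q) = \sup_{f \in \mathcal{F}} \Big| \mathbb{E}_{P}[f(X)] - \mathbb{E}_{Q}[f(X)] \Big|.
\]
The KL-divergence depends on the distributions through the density ratio $r(x) = p(x)/q(x)$, which is the quantity DRE targets. As a minimal sketch of maximum-likelihood DRE, the probabilistic-classification route below trains a logistic model by maximum likelihood and recovers the ratio through Bayes' rule; this is a standard instance of the idea, not the paper's exact unconstrained estimator or its stratified sampling scheme.

```python
# A minimal sketch of maximum-likelihood density-ratio estimation (DRE)
# via probabilistic classification -- one standard instance of the idea,
# not necessarily this paper's estimator or sampling scheme.
import numpy as np
from sklearn.linear_model import LogisticRegression

def estimate_density_ratio(x_p, x_q):
    """Estimate r(x) = p(x) / q(x) from samples x_p ~ P and x_q ~ Q.

    A classifier trained by maximum likelihood to separate the two
    samples recovers the ratio through Bayes' rule:
        r(x) = (n_q / n_p) * P(y=1 | x) / P(y=0 | x).
    """
    X = np.vstack([x_p, x_q])
    y = np.concatenate([np.ones(len(x_p)), np.zeros(len(x_q))])
    clf = LogisticRegression().fit(X, y)

    def ratio(x):
        proba = clf.predict_proba(x)  # columns ordered as [P(y=0|x), P(y=1|x)]
        return (len(x_q) / len(x_p)) * proba[:, 1] / proba[:, 0]

    return ratio

# Usage example: P = N(0.5, 1) and Q = N(0, 1) in one dimension.
rng = np.random.default_rng(0)
x_p = rng.normal(0.5, 1.0, size=(2000, 1))
x_q = rng.normal(0.0, 1.0, size=(2000, 1))
ratio = estimate_density_ratio(x_p, x_q)
print(ratio(np.array([[0.0], [1.0]])))  # ratio estimates at x = 0 and x = 1
```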