The families of $f$-divergences (e.g. the Kullback-Leibler divergence) and Integral Probability Metrics (e.g. total variation distance or maximum mean discrepancies) are widely used to quantify the similarity between probability distributions. In this work, we systematically study the relationship between these two families from the perspective of convex duality. Starting from a tight variational representation of the $f$-divergence, we derive a generalization of the moment-generating function, which we show exactly characterizes the best lower bound of the $f$-divergence as a function of a given IPM. Using this characterization, we obtain new bounds while also recovering in a unified manner well-known results, such as Hoeffding's lemma, Pinsker's inequality and its extension to subgaussian functions, and the Hammersley-Chapman-Robbins bound. This characterization also allows us to prove new results on topological properties of the divergence which may be of independent interest.
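For reference, the standard objects behind this abstract can be sketched as follows; the notation ($P$, $Q$, the function class $\mathcal{G}$, the conjugate $f^*$) is illustrative rather than taken from the paper itself. Given a convex function $f$ with $f(1) = 0$, the $f$-divergence and the IPM generated by $\mathcal{G}$ are
\[
  D_f(P \,\|\, Q) = \int f\!\left(\frac{\mathrm{d}P}{\mathrm{d}Q}\right)\mathrm{d}Q,
  \qquad
  \gamma_{\mathcal{G}}(P, Q) = \sup_{g \in \mathcal{G}} \bigl(\mathbb{E}_P[g] - \mathbb{E}_Q[g]\bigr),
\]
and the classical variational lower bound underlying such dualities reads, for any measurable $g$ and with $f^*(y) = \sup_x \,(xy - f(x))$ the convex conjugate of $f$,
\[
  D_f(P \,\|\, Q) \;\geq\; \mathbb{E}_P[g] - \mathbb{E}_Q[f^*(g)].
\]
Pinsker's inequality, $\sup_A |P(A) - Q(A)| \leq \sqrt{\tfrac{1}{2}\, D_{\mathrm{KL}}(P \,\|\, Q)}$, is one instance of the kind of $f$-divergence-versus-IPM bound the abstract refers to.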