In this paper, we provide three applications of $f$-divergences: (i) we introduce a Sanov-type upper bound on the tail probability of a sum of independent random variables based on the super-modular $f$-divergence, and show that our generalized Sanov bound strictly improves over the ordinary one; (ii) we consider the lossy compression problem, which studies the set of achievable rates for a given distortion and code length. We extend the rate-distortion function using mutual $f$-information and, using super-modular $f$-divergences, provide new and strictly better bounds on achievable rates in the finite-blocklength regime; and (iii) we establish a connection between the generalization error of algorithms with bounded input/output mutual $f$-information and a generalized rate-distortion problem. This connection allows us to bound the generalization error of learning algorithms via lower bounds on the $f$-rate-distortion function. Our bound relies on a new lower bound on the rate-distortion function that (in some examples) strictly improves over the previously best-known bounds.
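For reference, here is a minimal sketch of the classical objects these results generalize, under the standard textbook definitions (the notation below is our assumption, not the paper's). For a convex $f$ with $f(1) = 0$, the $f$-divergence between distributions $P$ and $Q$ is
\[
D_f(P \,\|\, Q) \;=\; \mathbb{E}_{Q}\!\left[ f\!\left( \frac{dP}{dQ} \right) \right],
\]
which recovers the KL divergence $D(P \,\|\, Q)$ at $f(t) = t \log t$. The ordinary Sanov/Chernoff tail bound that (i) sharpens states that, for i.i.d. $X_1, \dots, X_n \sim P$ and $a \ge \mathbb{E}_P[X]$,
\[
\Pr\!\Big[ \tfrac{1}{n} \textstyle\sum_{i=1}^{n} X_i \ge a \Big]
\;\le\; \exp\!\Big( -n \inf_{Q :\, \mathbb{E}_Q[X] \ge a} D(Q \,\|\, P) \Big),
\]
and the classical rate-distortion function that the mutual $f$-information formulation in (ii) extends is
\[
R(D) \;=\; \min_{P_{\hat{X} \mid X} :\; \mathbb{E}[d(X, \hat{X})] \le D} I(X; \hat{X}).
\]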