In this paper, we introduce super-modular $\mf$-divergences and provide three applications for them: (i) we introduce a generalization of Sanov's upper bound on the tail probability of the sum of independent random variables based on super-modular $\mf$-divergence, and show that our generalized Sanov bound strictly improves over the ordinary one; (ii) we consider the lossy compression problem, which studies the set of achievable rates for a given distortion and code length. We extend the rate-distortion function using mutual $\mf$-information and provide new and strictly better bounds on achievable rates in the finite blocklength regime using super-modular $\mf$-divergences; and (iii) we provide a connection between the generalization error of algorithms with bounded input/output mutual $\mf$-information and a generalized rate-distortion problem. This connection allows us to bound the generalization error of learning algorithms using lower bounds on the rate-distortion function. Our bound is based on a new lower bound on the rate-distortion function that (in some examples) strictly improves over the previously best-known bounds. Moreover, super-modular $\mf$-divergences are utilized to reduce the dimension of the problem and obtain single-letter bounds.
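For reference, the classical baselines that the three applications above strengthen can be recalled as follows. This is a standard-background sketch only; the notation ($\Lambda^*$, $R(D)$, $I(S;W)$, $\sigma$) is illustrative and not necessarily that of the paper, and the $\mf$-divergence generalizations themselves are not reproduced here.
\[
  \Pr\!\Big[\tfrac{1}{n}\textstyle\sum_{i=1}^{n} X_i \ge a\Big] \le e^{-n\Lambda^*(a)},
  \qquad \Lambda^*(a) = \sup_{\lambda \ge 0}\big(\lambda a - \log \mathbb{E}[e^{\lambda X_1}]\big),
\]
the Chernoff/Sanov-type tail bound for i.i.d.\ $X_1,\dots,X_n$ and $a$ above the mean;
\[
  R(D) = \min_{P_{\hat X \mid X}:\ \mathbb{E}[d(X,\hat X)] \le D} I(X;\hat X),
\]
the ordinary rate-distortion function via mutual information; and, for a training set $S$ of $n$ samples, hypothesis $W$, and a $\sigma$-sub-Gaussian loss, the mutual-information generalization bound of Xu and Raginsky,
\[
  \big|\mathbb{E}[\mathrm{gen}(S,W)]\big| \le \sqrt{\tfrac{2\sigma^2}{n}\, I(S;W)}.
\]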