Information measures can be constructed from Rényi divergences much like mutual information is constructed from the Kullback-Leibler divergence. One such information measure is known as Sibson's $\alpha$-mutual information and has recently received renewed attention in several contexts: concentration of measure under dependence, statistical learning, hypothesis testing, and estimation theory. In this paper, we survey and extend the state of the art. In particular, we introduce variational representations for Sibson's $\alpha$-mutual information and employ them in each of the contexts just described to derive novel results, namely generalized transportation-cost inequalities and Fano-type inequalities. We also present an overview of known applications, spanning from learning theory and Bayesian risk to universal prediction.
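For concreteness, one standard way of writing Sibson's $\alpha$-mutual information (this closed form follows the usual convention in the literature for discrete alphabets and $\alpha \in (0,1)\cup(1,\infty)$; it is not spelled out in the abstract itself) is

\[
I_\alpha(X;Y) \;=\; \min_{Q_Y} D_\alpha\!\left(P_{XY} \,\middle\|\, P_X \times Q_Y\right)
\;=\; \frac{\alpha}{\alpha-1}\,\log \sum_{y} \Bigl(\sum_{x} P_X(x)\,P_{Y\mid X}(y\mid x)^{\alpha}\Bigr)^{1/\alpha},
\]

where $D_\alpha$ denotes the Rényi divergence of order $\alpha$. The minimization over $Q_Y$ mirrors the identity $I(X;Y) = \min_{Q_Y} D(P_{XY}\,\|\,P_X \times Q_Y)$ for ordinary mutual information, which is recovered in the limit $\alpha \to 1$.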