The quadratically decaying property of the information rate function states that, for a fixed conditional distribution $p_{\mathsf{Y}|\mathsf{X}}$, the mutual information between the (finite) discrete random variables $\mathsf{X}$ and $\mathsf{Y}$ decreases at least quadratically in the Euclidean distance as $p_\mathsf{X}$ moves away from the set of capacity-achieving input distributions. This property is particularly useful in the study of higher-order asymptotics and finite-blocklength information theory, where it was already used implicitly by Strassen [1] and later, more explicitly, by Polyanskiy-Poor-Verd\'u [2]. However, the proofs outlined in both works contain gaps that are nontrivial to close. This comment provides an alternative, complete proof of this property.
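The property can be observed numerically in a simple special case. The following sketch (not part of the original comment; the binary symmetric channel and all names are illustrative choices) computes $C - I(p_\mathsf{X}; p_{\mathsf{Y}|\mathsf{X}})$ for a $\mathrm{BSC}(p)$ as the input distribution is perturbed away from the uniform capacity-achieving input by $\varepsilon$, and checks that the gap scales like $\varepsilon^2$:

```python
import math

def h(x):
    # binary entropy in bits; by convention h(0) = h(1) = 0
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def mi_bsc(q, p):
    # mutual information I(X;Y) over a BSC(p) with input P(X=1) = q:
    # I = H(Y) - H(Y|X) = h(q(1-p) + (1-q)p) - h(p)
    return h(q * (1 - p) + (1 - q) * p) - h(p)

p = 0.1
C = 1 - h(p)  # capacity, achieved at the uniform input q = 1/2
for eps in (0.02, 0.01, 0.005):
    gap = C - mi_bsc(0.5 + eps, p)
    # gap / eps^2 should stay near a positive constant as eps -> 0,
    # consistent with at-least-quadratic decay of I near the optimum
    print(f"eps={eps}: (C - I)/eps^2 = {gap / eps**2:.4f}")
```

For this channel the ratio $(C-I)/\varepsilon^2$ approaches $(1-2p)^2 \cdot 2/\ln 2$, the second-order Taylor coefficient of the entropy term; the at-least-quadratic decay stated above is the general, distribution-free version of this local behavior.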