Let $\nu$ and $\mu$ be probability distributions on $\mathbb{R}^n$, and let $\nu_s,\mu_s$ be their evolutions under the heat flow, that is, the probability distributions resulting from convolving their densities with the density of an isotropic Gaussian random vector with variance $s$ in each entry. This paper studies the rate of decay of $s\mapsto D(\nu_s\|\mu_s)$ for various divergences, including the $\chi^2$ and Kullback-Leibler (KL) divergences. We prove upper and lower bounds on the strong data-processing inequality (SDPI) coefficients corresponding to the source $\mu$ and the Gaussian channel. We also prove generalizations of de Bruijn's identity, and of Costa's result on the concavity in $s$ of the differential entropy of $\nu_s$. As a byproduct of our analysis, we obtain new lower bounds on the mutual information between $X$ and $Y=X+\sqrt{s} Z$, where $Z$ is a standard Gaussian vector in $\mathbb{R}^n$, independent of $X$, and on the minimum mean-square error (MMSE) in estimating $X$ from $Y$, in terms of the Poincar\'e constant of $X$.
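The decay of $s\mapsto D(\nu_s\|\mu_s)$ can be seen concretely in the Gaussian-to-Gaussian case, where the KL divergence has a closed form and the heat flow simply adds $s$ to each variance. The sketch below is only an illustration of this monotone decay for one-dimensional Gaussians; the specific means and variances are arbitrary choices, not taken from the paper.

```python
import math

def kl_gauss(m1, v1, m2, v2):
    """Closed-form KL divergence D(N(m1, v1) || N(m2, v2)) for 1-D Gaussians."""
    return 0.5 * (math.log(v2 / v1) + (v1 + (m1 - m2) ** 2) / v2 - 1.0)

def kl_heat(s, m1=0.0, v1=1.0, m2=1.0, v2=2.0):
    """D(nu_s || mu_s): convolving with N(0, s) adds s to each variance.

    nu = N(m1, v1), mu = N(m2, v2) are illustrative choices.
    """
    return kl_gauss(m1, v1 + s, m2, v2 + s)

# By the data-processing inequality, s -> D(nu_s || mu_s) is nonincreasing.
for s in (0.0, 1.0, 5.0, 25.0):
    print(f"s = {s:5.1f}   KL = {kl_heat(s):.4f}")
```

Running this shows the divergence shrinking toward zero as $s$ grows, which is exactly the phenomenon whose rate the paper quantifies.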