Let $P_Z$ be a given distribution on $\mathbb{R}^n$. For any $y\in\mathbb{R}^n$, we may interpret $\rho(y):=\ln\mathbb{E}[e^{\left<y,Z\right>}]$ as a soft-max of $\left<y,Z\right>$. We explore lower bounds on $\mathbb{E}[\rho(Y)]$, for a random vector $Y$ on $\mathbb{R}^n$, in terms of the minimum of the mutual information $I(Z;\bar{Z})$ over couplings $P_{Z\bar{Z}}$ of $P_Z$ with itself under which $Z-\bar{Z}$ is bounded in a certain sense. This may be viewed as a soft version of Sudakov's minoration, which lower-bounds the expected supremum of a stochastic process in terms of the packing number. Our method is based on convex geometry (thrifty approximation of convex bodies) and works for general, non-Gaussian $Y$. When $Y$ is Gaussian and $\bar{Z}$ converges to $Z$, this recovers a recent inequality of Bai-Wu-Ozgur on information-constrained optimal transport, previously established using Gaussian-specific techniques. We also use soft minoration to obtain asymptotically (in tensor order) tight bounds on the free energy in the Sherrington-Kirkpatrick model with spins uniformly distributed on a type class, implying asymptotically tight bounds for the type~II error exponent in spiked tensor detection.
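For a concrete illustration of the soft-max interpretation (a standard computation, not taken from the results above): if $Z$ is uniformly distributed on a finite set $\{z_1,\dots,z_N\}\subset\mathbb{R}^n$, then
$$\rho(y)=\ln\Big(\frac{1}{N}\sum_{i=1}^N e^{\left<y,z_i\right>}\Big),\qquad \max_{1\le i\le N}\left<y,z_i\right>-\ln N\;\le\;\rho(y)\;\le\;\max_{1\le i\le N}\left<y,z_i\right>,$$
so $\rho(y)$ tracks the maximum of $\left<y,z_i\right>$ up to an additive $\ln N$, and $\beta^{-1}\rho(\beta y)\to\max_{i}\left<y,z_i\right>$ as $\beta\to\infty$.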