The minimum mean-square error (MMSE) achievable by optimal estimation of a random variable $Y\in\mathbb{R}$ given another random variable $X\in\mathbb{R}^{d}$ is of much interest in a variety of statistical settings. In the context of estimation-theoretic privacy, the MMSE has been proposed as an information leakage measure that captures the ability of an adversary to estimate $Y$ upon observing $X$. In this paper we establish provable lower bounds for the MMSE based on a two-layer neural network estimator of the MMSE and the Barron constant of an appropriate function of the conditional expectation of $Y$ given $X$. Furthermore, we derive a general upper bound for the Barron constant that, when $X\in\mathbb{R}$ is post-processed by the additive Gaussian mechanism and $Y$ is binary, produces order-optimal estimates in the large noise regime. In order to obtain numerical lower bounds for the MMSE in some concrete applications, we introduce an efficient optimization process that approximates the value of the proposed neural network estimator. Overall, we provide effective machinery to obtain provable lower bounds for the MMSE.
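For concreteness, the following is a minimal sketch of the quantities the abstract refers to, written in standard notation that is assumed here rather than taken from the paper (the width $k$, parameters $\theta=(a_i,w_i,b_i,c)$, and activation $\sigma$ are illustrative):
\begin{align*}
  \mathrm{mmse}(Y\mid X) &= \inf_{f\ \text{measurable}} \mathbb{E}\big[(Y-f(X))^{2}\big]
    = \mathbb{E}\big[(Y-\mathbb{E}[Y\mid X])^{2}\big],\\
  \widehat{\mathrm{mmse}}_{k}(Y\mid X) &= \inf_{\theta}\, \mathbb{E}\big[(Y-f_{\theta}(X))^{2}\big],
  \qquad f_{\theta}(x)=\sum_{i=1}^{k} a_{i}\,\sigma\!\big(\langle w_{i},x\rangle+b_{i}\big)+c.
\end{align*}
Since the two-layer class contains only some measurable functions, $\widehat{\mathrm{mmse}}_{k}(Y\mid X)\ge \mathrm{mmse}(Y\mid X)$, and by the orthogonality of the conditional expectation the gap equals $\inf_{\theta}\mathbb{E}\big[(\mathbb{E}[Y\mid X]-f_{\theta}(X))^{2}\big]$. A Barron-type approximation bound, of order $C^{2}/k$ for a target with Barron constant $C$ (here, of an appropriate function of $\mathbb{E}[Y\mid X]$, as in the abstract), controls this gap and yields a provable lower bound of the schematic form $\mathrm{mmse}(Y\mid X)\ge \widehat{\mathrm{mmse}}_{k}(Y\mid X)-O(C^{2}/k)$.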