The conditional mean is a fundamental quantity whose applications include the theories of estimation and rate-distortion. It is also notoriously difficult to work with. This paper establishes novel bounds on the differential entropy of the conditional mean in the case of finite-variance input signals and additive Gaussian noise. The main result is a new lower bound in terms of the differential entropies of the input signal and the noisy observation. These results are also extended to the vector Gaussian channel and to the natural exponential family. Various other properties, such as upper bounds, asymptotics, Taylor series expansions, and connections to Fisher information, are obtained. Two applications of the lower bound, to remote source coding and the CEO problem, are discussed.
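To fix notation, the following is a minimal sketch of the scalar setting described above, assuming the standard additive-noise model; the symbols $X$, $N$, $\sigma^2$, $\hat{X}$, and $\varphi_\sigma$ are illustrative and not taken from the paper's statement of the bound.

% Scalar additive Gaussian noise channel (illustrative notation):
% input X with finite variance, noise N ~ N(0, sigma^2) independent of X,
% and \varphi_\sigma the N(0, sigma^2) density.
\begin{align}
  Y &= X + N, \qquad N \sim \mathcal{N}(0, \sigma^2), \quad N \perp X, \\
  \hat{X} &= \mathbb{E}[X \mid Y]
          = \frac{\int x\, p_X(x)\, \varphi_\sigma(Y - x)\, \mathrm{d}x}
                 {\int p_X(x)\, \varphi_\sigma(Y - x)\, \mathrm{d}x}.
\end{align}
% The main result lower-bounds the differential entropy h(\hat{X})
% in terms of h(X) and h(Y).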