We investigate the problem of representing information measures in terms of the moments of the underlying random variables. First, we derive polynomial approximations of the conditional expectation operator. We then apply these approximations to bound the best mean-square error achievable by a polynomial estimator, referred to here as the PMMSE. In Gaussian channels, the PMMSE coincides with the minimum mean-square error (MMSE) if and only if the input is either Gaussian or constant, i.e., if and only if the conditional expectation of the channel input given the output is a polynomial of degree at most 1. By combining the PMMSE with the I-MMSE relationship, we derive new formulas for information measures (e.g., differential entropy, mutual information) expressed in terms of the moments of the underlying random variables. As an application, we introduce estimators of information measures from data, obtained by replacing the moments in our formulas with sample moments. These estimators are shown to be asymptotically consistent and to possess desirable properties, e.g., invariance to affine transformations when used to estimate mutual information.
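As a minimal numerical sketch of the PMMSE notion described above (not the paper's own construction): the degree-n PMMSE of X given Y is the smallest mean-square error attainable by a polynomial estimator of degree at most n, which can be estimated empirically by least-squares polynomial regression. In the Gaussian-input, Gaussian-channel case, E[X|Y] is linear, so the empirical PMMSE should already match the MMSE at degree 1. All variable names and the sample size below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Gaussian channel: Y = X + N with X ~ N(0, 1) and noise N ~ N(0, 1).
n_samples = 100_000
x = rng.standard_normal(n_samples)
y = x + rng.standard_normal(n_samples)

def pmmse(x, y, degree):
    """Empirical degree-<=n PMMSE: least-squares polynomial fit of X on Y,
    then the resulting in-sample mean-square error."""
    coeffs = np.polyfit(y, x, degree)
    return np.mean((x - np.polyval(coeffs, y)) ** 2)

# For a Gaussian input, E[X|Y] = Y/2 is a degree-1 polynomial, so the
# PMMSE equals the MMSE = 1/(1 + 1) = 0.5 for every degree >= 1.
for d in (1, 2, 3):
    print(f"degree {d}: empirical PMMSE = {pmmse(x, y, d):.4f}")
```

Since the polynomial families are nested, the empirical PMMSE is non-increasing in the degree; here it plateaus at the MMSE from degree 1 onward, which is the "Gaussian or constant input" equality case stated in the abstract.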