We present a mutual information lower bound that can be used to analyze the effect of training in models with unknown parameters. For large-scale systems, we show that this bound can be computed as the difference between two derivatives of a conditional entropy function, without explicit estimation of the unknown parameters. We give a step-by-step procedure for computing the bound, illustrate it with an example application, and compare it with known classical mutual information bounds.
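The abstract does not reproduce the closed form of the bound, so the snippet below is only a minimal illustrative sketch of the kind of computation described: evaluating the difference between two numerical derivatives of a conditional entropy function. The toy entropy `cond_entropy`, the evaluation points `t1` and `t2`, and the sign convention are all hypothetical placeholders, not the paper's actual result.

```python
import numpy as np


def cond_entropy(t: float) -> float:
    """Placeholder conditional entropy as a function of a scalar argument t.

    Here: differential entropy of a Gaussian whose variance grows with t,
    0.5 * log(2*pi*e*(1 + t)).  The paper's actual conditional entropy
    function is not reproduced in this sketch.
    """
    return 0.5 * np.log(2.0 * np.pi * np.e * (1.0 + t))


def numerical_derivative(f, t: float, eps: float = 1e-5) -> float:
    """Central finite-difference approximation of f'(t)."""
    return (f(t + eps) - f(t - eps)) / (2.0 * eps)


def derivative_difference(t1: float, t2: float) -> float:
    """Illustrative 'difference of two derivatives' computation.

    Assumes (hypothetically) an evaluation of the form H'(t1) - H'(t2);
    the actual arguments and convention come from the paper's
    step-by-step procedure.
    """
    return numerical_derivative(cond_entropy, t1) - numerical_derivative(cond_entropy, t2)


if __name__ == "__main__":
    # Hypothetical evaluation points chosen only for demonstration.
    print(derivative_difference(t1=2.0, t2=0.5))
```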