The entropy power inequality for independent random vectors is a foundational result of information theory, with deep connections to probability and geometric functional analysis. Several extensions of the entropy power inequality to settings with dependence have been developed, including results of Takano, Johnson, and Rioul. We extend this line of work by developing a quantitative version of the entropy power inequality for dependent random vectors. A notable consequence is that an entropy power inequality stated in terms of conditional entropies holds for random vectors whose joint density is log-supermodular.
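For context, the following LaTeX sketch recalls the classical inequality and one standard conditional-entropy form of the kind described above; the conditional statement and its log-supermodularity hypothesis are a paraphrase of the abstract's claim, not a verbatim transcription of the paper's theorem, and the exact hypotheses proved in the paper may differ.

% Classical entropy power inequality (Shannon--Stam): for independent
% random vectors $X, Y$ on $\mathbb{R}^n$ with densities,
\[
  e^{2h(X+Y)/n} \;\ge\; e^{2h(X)/n} + e^{2h(Y)/n},
\]
% equivalently $N(X+Y) \ge N(X) + N(Y)$, where
% $N(X) = e^{2h(X)/n}/(2\pi e)$ is the entropy power.
%
% A conditional-entropy form of the kind the abstract describes:
% if the joint density $f$ of $(X, Y)$ is log-supermodular, i.e.
\[
  f(u \vee v)\, f(u \wedge v) \;\ge\; f(u)\, f(v)
  \qquad \text{for all } u, v \in \mathbb{R}^{2n},
\]
% with $\vee$, $\wedge$ the coordinatewise maximum and minimum, then
\[
  e^{2h(X+Y)/n} \;\ge\; e^{2h(X \mid Y)/n} + e^{2h(Y \mid X)/n}.
\]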