A strengthened version of the central limit theorem for discrete random variables is established, relying only on information-theoretic tools and elementary arguments. It is shown that the relative entropy between the standardised sum of $n$ independent and identically distributed lattice random variables and an appropriately discretised Gaussian vanishes as $n\to\infty$.
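As a sketch of the statement in symbols (the notation $S_n$, $Z_n$, and $D(\cdot\,\|\,\cdot)$ is assumed here for illustration and is not fixed by the abstract itself):
\[
  D\bigl(S_n \,\big\|\, Z_n\bigr)
  \;=\; \sum_{x} P_{S_n}(x)\,\log\frac{P_{S_n}(x)}{P_{Z_n}(x)}
  \;\longrightarrow\; 0
  \qquad \text{as } n\to\infty,
\]
where $S_n = \frac{1}{\sigma\sqrt{n}}\sum_{i=1}^{n}(X_i-\mu)$ is the standardised sum of the i.i.d. lattice random variables $X_1,\dots,X_n$ with mean $\mu$ and variance $\sigma^2$, $Z_n$ denotes a Gaussian with matching mean and variance discretised to the lattice supporting $S_n$, and $D(\cdot\,\|\,\cdot)$ is the relative entropy (Kullback--Leibler divergence).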