We recall some of the history of the information-theoretic approach to deriving core results in probability theory, and indicate parts of the recent resurgence of interest in this area, with current progress along several interesting directions. Then we give a new information-theoretic proof of a finite version of de Finetti's classical representation theorem for finite-valued random variables. We derive an upper bound on the relative entropy between the distribution of the first $k$ random variables in a sequence of $n$ exchangeable random variables and an appropriate mixture of product distributions. The mixing measure is characterised as the law of the empirical measure of the original sequence, and de Finetti's result is recovered as a corollary. The proof is motivated by the Gibbs conditioning principle of statistical mechanics, and it proceeds through an appealing sequence of steps. The technical estimates required for these steps are obtained using a collection of combinatorial tools known within information theory as `the method of types.'
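To fix ideas, the bound described above can be stated schematically as follows; here $\hat{P}_n$ denotes the empirical measure of $X_1,\dots,X_n$, $\mu_n$ its law, and $\varepsilon(k,n)$ stands in for the explicit rate derived in the paper (the symbols $\hat{P}_n$, $\mu_n$ and $\varepsilon(k,n)$ are introduced here only for illustration):
\[
\hat{P}_n \;=\; \frac{1}{n}\sum_{i=1}^{n}\delta_{X_i},
\qquad
\mu_n \;=\; \mathrm{Law}\big(\hat{P}_n\big),
\]
\[
D\!\left( P_{X_1^k} \,\Big\|\, \int Q^{\otimes k}\, \mu_n(dQ) \right) \;\le\; \varepsilon(k,n),
\]
where $\varepsilon(k,n)\to 0$ as $n\to\infty$ for each fixed $k$, so that de Finetti's representation of $P_{X_1^k}$ as a mixture of product distributions follows as a corollary in the limit.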
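For background, the two standard estimates at the heart of the method of types are the polynomial bound on the number of types and the exponential bounds on the probability of a type class. For strings $x_1^n$ over a finite alphabet $S$, with $\mathcal{P}_n$ the set of possible empirical distributions (types) of $n$-strings and $T(P)$ the set of strings of type $P$, these read (stated here only as standard facts, not as quoted from the present paper's development):
\[
|\mathcal{P}_n| \;\le\; (n+1)^{|S|},
\qquad
(n+1)^{-|S|}\, 2^{-n D(P\|Q)} \;\le\; Q^n\big(T(P)\big) \;\le\; 2^{-n D(P\|Q)},
\]
where $Q^n$ is the $n$-fold product of a distribution $Q$ on $S$ and $D(\cdot\|\cdot)$ denotes relative entropy in bits.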