This paper redefines information theory as a foundational mathematical discipline, extending beyond its traditional role in engineering applications. Building on Shannon's entropy, rate–distortion theory, and Wyner–Ziv coding, we show that all optimization methods can be interpreted as projections of continuous information onto discrete binary spaces. Numbers are not intrinsic carriers of meaning but codes of information, with binary digits (0 and 1) serving as universal symbols sufficient for all mathematical structures. Rate–distortion optimization via Lagrangian multipliers connects quantization error directly to fundamental limits of representation, while Wyner–Ziv coding admits a path integral interpretation over probability manifolds, unifying quantization, inference, geometry, and error. We further extend this framework into category theory, topological data analysis, and universal coding, situating computation and game theory as complementary perspectives. The result is a set of postulates that elevate information theory to the status of a universal mathematical language.
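As a concrete illustration of the rate–distortion optimization via Lagrangian multipliers mentioned above, the following minimal sketch traces one point on the rate–distortion curve of a discrete source by Blahut–Arimoto iteration, minimizing the Lagrangian I(X; X̂) + λ·E[d(X, X̂)]. The function name `blahut_arimoto_rd` and the Bernoulli(1/2) test case are illustrative choices, not taken from the paper itself.

```python
import numpy as np

def blahut_arimoto_rd(p_x, dist, lam, iters=200):
    """One point on the rate-distortion curve: minimize the Lagrangian
    I(X; Xhat) + lam * E[d(X, Xhat)] by Blahut-Arimoto iteration."""
    n_y = dist.shape[1]
    q_y = np.full(n_y, 1.0 / n_y)              # reproduction marginal q(xhat)
    for _ in range(iters):
        # Optimal test channel for the current marginal:
        #   q(xhat | x) ∝ q(xhat) * exp(-lam * d(x, xhat))
        w = q_y * np.exp(-lam * dist)
        q_y_x = w / w.sum(axis=1, keepdims=True)
        q_y = p_x @ q_y_x                      # re-derive the marginal
    joint = p_x[:, None] * q_y_x
    D = float((joint * dist).sum())            # expected distortion
    ratio = np.where(joint > 0, q_y_x / q_y, 1.0)
    R = float((joint * np.log2(ratio)).sum())  # mutual information in bits
    return R, D

# Bernoulli(1/2) source with Hamming distortion, where the closed form
# is R(D) = 1 - H2(D); the iteration should reproduce it numerically.
p_x = np.array([0.5, 0.5])
dist = np.array([[0.0, 1.0], [1.0, 0.0]])
R, D = blahut_arimoto_rd(p_x, dist, lam=2.0)
H2 = -D * np.log2(D) - (1 - D) * np.log2(1 - D)
print(R, D, 1 - H2)  # R should agree with 1 - H2(D)
```

Sweeping `lam` over positive values traces out the full curve, making the quantization-error/representation-limit tradeoff in the abstract explicit: each multiplier selects one optimal (R, D) operating point.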