This article serves as a brief introduction to Shannon information theory. It mainly covers the concepts of information, Shannon entropy, and channel capacity, all developed in a purely combinatorial flavor. Some issues usually not addressed in the literature are discussed here as well. In particular, we show that channel capacity can seemingly be defined differently, which would potentially allow more messages to be transmitted within a fixed, sufficiently long time duration. However, for a channel carrying a finite number of letters, the channel capacity unfortunately remains the same as the Shannon limit.