Information theory, introduced by Shannon, has been extremely successful and influential as a mathematical theory of communication. Shannon's notion of information does not consider the meaning of the messages being communicated but only their probability. Even so, computational approaches regularly appeal to "information processing" to study how meaning is encoded and decoded in natural and artificial systems. Here, we contrast Shannon information theory with integrated information theory (IIT), which was developed to account for the presence and properties of consciousness. IIT considers meaning as integrated information and characterizes it as a structure, rather than as a message or code. In principle, IIT's axioms and postulates allow one to "unfold" a cause-effect structure from a substrate in a state, a structure that fully defines the intrinsic meaning of an experience and its contents. It follows that, for the communication of meaning, the cause-effect structures of sender and receiver must be similar.
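As a brief illustration of the claim that Shannon's measures depend only on probabilities (the notation below is ours, not part of the original text), the self-information of an outcome $x$ and the entropy of a source $X$ are

\[
I(x) = -\log_2 p(x), \qquad H(X) = -\sum_{x} p(x)\,\log_2 p(x).
\]

Both quantities are functions of the probability distribution $p$ alone: two messages with the same probability carry the same Shannon information regardless of what they mean.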