In an effort to develop the foundations for a non-stochastic theory of information, the notion of $\delta$-mutual information between uncertain variables is introduced as a generalization of Nair's non-stochastic information functional. Several properties of this new quantity are illustrated, and used to prove a channel coding theorem in a non-stochastic setting. Namely, it is shown that the largest $\delta$-mutual information between received and transmitted codewords over $\epsilon$-noise channels equals the $(\epsilon, \delta)$-capacity. This notion of capacity generalizes the Kolmogorov $\epsilon$-capacity to packing sets of overlap at most $\delta$, and is a variation of a previous definition proposed by one of the authors. Results are then extended to more general noise models, and to non-stochastic, memoryless, stationary channels. Finally, sufficient conditions are established for the factorization of the $\delta$-mutual information and to obtain a single letter capacity expression. Compared to previous non-stochastic approaches, the presented theory admits the possibility of decoding errors as in Shannon's probabilistic setting, while retaining a worst-case, non-stochastic character.
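As background for the generalization described above, the classical Kolmogorov $\epsilon$-capacity of a set $A$ in a metric space can be recalled as follows (a standard formulation; the notation $M_\epsilon(A)$ for the packing number is ours, and conventions differ by a factor of two in the distinguishability radius):
\[
  \mathcal{C}_\epsilon(A) \;=\; \log_2 M_\epsilon(A),
\]
where $M_\epsilon(A)$ denotes the largest cardinality of a subset of $A$ whose points are pairwise more than $\epsilon$ apart. The $(\epsilon,\delta)$-capacity studied in this work relaxes this zero-overlap packing requirement, allowing the decoding regions to overlap by at most $\delta$, which is what admits decoding errors in the non-stochastic setting.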