The Shannon-Hartley theorem accurately determines the channel capacity when the signal observation time is infinite. However, the finite-time mutual information, which remains unknown, is essential for guiding the design of practical communication systems. In this paper, we investigate the mutual information between two correlated Gaussian processes within a finite-time observation window. We first derive the finite-time mutual information as a limit expression, and then numerically compute the mutual information within a single finite-time window. We reveal that the number of bits transmitted per second within the finite-time window can exceed the mutual information averaged over the entire time axis, which we call the exceed-average phenomenon. Furthermore, we derive a finite-time mutual information formula for a typical signal autocorrelation case by utilizing the Mercer expansion of trace-class operators, revealing the connection between the finite-time mutual information problem and operator theory. Finally, we analytically prove the existence of the exceed-average phenomenon in this typical case, and demonstrate its compatibility with the Shannon capacity.
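As a minimal illustration of the kind of quantity the paper studies (not the paper's own derivation), the mutual information between a Gaussian process and its noisy observation over a finite window can be approximated numerically by discretizing the window and using covariance log-determinants. The squared-exponential autocorrelation kernel, window length, and noise variance below are assumed for illustration only:

```python
import numpy as np

def rbf_kernel(t, ell=0.3):
    """Illustrative squared-exponential signal autocorrelation on sample times t."""
    d = t[:, None] - t[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def finite_time_mi(T=1.0, n=200, noise_var=0.1):
    """Estimate I(X; Y) in nats for Y = X + N over the window [0, T].

    X is a zero-mean Gaussian process sampled at n points; N is white
    Gaussian noise. For jointly Gaussian variables,
    I(X; Y) = 0.5 * (log det(K_x + K_n) - log det(K_n)).
    """
    t = np.linspace(0.0, T, n)
    K_x = rbf_kernel(t)                # signal covariance (discretized kernel)
    K_n = noise_var * np.eye(n)        # white-noise covariance
    _, logdet_y = np.linalg.slogdet(K_x + K_n)
    _, logdet_n = np.linalg.slogdet(K_n)
    return 0.5 * (logdet_y - logdet_n)

print(finite_time_mi())  # mutual information (nats) within one finite window
```

Dividing the result by the window length `T` gives a bits-per-second figure for that window, which is the quantity compared against the long-time average in the exceed-average phenomenon.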