In this paper we consider the convergence of the conditional entropy to the entropy rate for Markov chains. Convergence of certain statistics of long range dependent processes, such as the sample mean, is known to be slow. Carpio and Daley \cite{carpio2007long} showed that the convergence of the $n$-step transition probabilities to the stationary distribution is slow, without quantifying the convergence rate. We prove that this slow convergence also applies to an information-theoretic quantity, the entropy rate, by showing that the rate of convergence of the conditional entropy to the entropy rate equals the rate of convergence of the $n$-step transition probabilities to the stationary distribution, which is precisely the Markov chain mixing time problem. We then quantify this convergence rate, showing that it is $O(n^{2H-2})$, where $n$ is the number of steps of the Markov chain and $H$ is the Hurst parameter. Finally, we show that, as a consequence of this slow convergence, the mutual information between past and future is infinite if and only if the Markov chain is long range dependent. This is a discrete analogue of characterisations established for other long range dependent processes.
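In symbols, the rate result can be sketched as follows; here $h_n$ and $h$ are placeholder symbols (not fixed by the abstract) for the conditional entropy after $n$ steps and the entropy rate, respectively:

```latex
% Sketch of the claimed rate. Notation assumed for illustration:
% h_n = conditional entropy after n steps, h = entropy rate,
% H in (1/2, 1) the Hurst parameter of the long range dependent chain.
\[
  h_n - h = O\!\left(n^{2H-2}\right), \qquad n \to \infty,
\]
% Since 2H - 2 \in (-1, 0) when 1/2 < H < 1, the decay is polynomial
% and strictly slower than the geometric rate of short range dependent
% (e.g. finite, irreducible, aperiodic) Markov chains.
```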