To characterize the Kullback-Leibler divergence and the Fisher information in general parametrized hidden Markov models, we first show in this paper that the log-likelihood and its derivatives can be represented as additive functionals of a Markovian iterated function system, and then provide explicit characterizations of these two quantities through this representation. Moreover, we show that the Kullback-Leibler divergence can be locally approximated by a quadratic function determined by the Fisher information. Results relating to the Cram\'{e}r-Rao lower bound and the H\'{a}jek-Le Cam local asymptotic minimax theorem are also given. As an application of our results, we provide a theoretical justification for Akaike information criterion (AIC) model selection in general hidden Markov models. Finally, to illustrate our theory, we study three concrete models: a Gaussian vector autoregressive moving average model of order $(p,q)$, recurrent neural networks, and the temporal restricted Boltzmann machine.
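Concretely, the local quadratic approximation referred to above takes the following standard form, sketched here under the usual regularity assumptions, where $\theta^*$ denotes the true parameter and $I(\theta^*)$ the Fisher information matrix (this notation is assumed for illustration):
% Sketch: second-order expansion of the Kullback-Leibler divergence
% around theta^*; theta^* and I(theta^*) are assumed notation.
\[
\mathrm{KL}\bigl(P_{\theta^*} \,\|\, P_{\theta}\bigr)
= \tfrac{1}{2}\,(\theta - \theta^*)^{\top} I(\theta^*)\,(\theta - \theta^*)
+ o\bigl(\lVert \theta - \theta^* \rVert^{2}\bigr)
\qquad \text{as } \theta \to \theta^*.
\]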