A0734
Title: Kullback-Leibler divergence and Akaike information criterion in general hidden Markov models
Authors: Chu-Lan Kao - National Yang Ming Chiao Tung University (Taiwan) [presenting]
Tianxiao Pang - Zhejiang University (China)
Cheng-Der Fuh - National Central University (Taiwan)
Abstract: To characterize the Kullback-Leibler divergence and the Fisher information in general parametrized hidden Markov models, it is first shown that the log-likelihood and its derivatives can be represented as additive functionals of a Markovian iterated function system, and explicit characterizations of these two quantities are then provided through this representation. Moreover, it is shown that the Kullback-Leibler divergence can be locally approximated by a quadratic function determined by the Fisher information. Results relating to the Cramér-Rao lower bound and the Hájek-Le Cam local asymptotic minimax theorem are also given. As an application, a theoretical justification is provided for using the Akaike information criterion (AIC) for model selection in general hidden Markov models. Last, three concrete models are studied to illustrate the theory: a Gaussian vector autoregressive-moving average model of order $(p,q)$, recurrent neural networks, and a temporally restricted Boltzmann machine.
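For context only, a minimal sketch of the two quantities named above in their standard forms; the notation here ($\theta_0$ for the true parameter, $I(\theta_0)$ for the Fisher information, $\hat{\theta}$ for the maximum likelihood estimator, $L$ for the likelihood, and $k$ for the number of free parameters) is assumed for illustration and is not taken from the abstract:

$$\mathrm{KL}(\theta_0 \,\|\, \theta) \approx \tfrac{1}{2}\,(\theta - \theta_0)^{\top} I(\theta_0)\,(\theta - \theta_0) \quad \text{as } \theta \to \theta_0, \qquad \mathrm{AIC} = -2 \log L(\hat{\theta}) + 2k.$$

The abstract's contribution can be read as establishing that this familiar quadratic link between the divergence and the Fisher information, and hence the usual justification for AIC, carries over to general hidden Markov models.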