A novel sequential inferential method for Bayesian dynamic generalised linear models is presented, covering both univariate and multivariate $k$-parametric exponential families. The method efficiently handles diverse response types, including multinomial, gamma, normal, and Poisson-distributed outcomes, by leveraging the conjugate and predictive structure of the exponential family. The approach integrates concepts from information geometry, such as the projection theorem and the Kullback-Leibler divergence, and aligns with recent advances in variational inference. Applications to both synthetic and real datasets highlight the method's computational efficiency and scalability, which surpass those of alternative approaches. The approach supports the strategic incorporation of new information, facilitating monitoring, intervention, and the use of discount factors, as is typical in sequential analyses. The R package kDGLM is available for direct use by applied researchers, facilitating the implementation of the method for specific $k$-parametric dynamic generalised linear models.
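As an illustration of how an applied user might interact with the package, the sketch below fits a Poisson dynamic generalised linear model with a local trend and a yearly seasonal component to the classic `AirPassengers` series. It assumes the `polynomial_block()`, `harmonic_block()`, `Poisson()`, and `fit_model()` constructors exported by kDGLM; argument names follow the package's vignettes and may differ across releases, so this is an illustrative sketch rather than canonical usage.

```r
# Minimal sketch of a kDGLM workflow (constructor names and arguments
# assumed from the package documentation; check your installed version).
library(kDGLM)

# Structural blocks: a second-order polynomial (local level + trend) and a
# yearly harmonic component, each with its own discount factor D controlling
# the loss of information between time steps.
level  <- polynomial_block(rate = 1, order = 2, D = 0.95)
season <- harmonic_block(rate = 1, period = 12, D = 0.975)

# Outcome block: monthly passenger counts modelled as Poisson, with the
# latent linear predictor "rate" defined by the blocks above.
outcome <- Poisson(lambda = "rate", data = c(AirPassengers))

# Sequential fit (filtering and smoothing) combining structure and outcome.
fitted_model <- fit_model(level, season, AirPassengers = outcome)

summary(fitted_model)  # posterior summaries of the latent states
plot(fitted_model)     # fitted values and predictions against the data
```

The discount factors (`D`) play the role noted in the abstract: they govern how quickly past information is downweighted, which is what enables monitoring and intervention in the sequential analysis.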