The Gaussian process state-space model (GPSSM) has attracted much attention over the past decade. However, the representation power of the GPSSM remains far from satisfactory. Most GPSSM studies rely on the standard Gaussian process (GP) with an elementary kernel, such as the squared exponential (SE) kernel or the Mat\'{e}rn kernel, which limits the model's representation power and its applicability to complex scenarios. To address this issue, this paper proposes a novel class of probabilistic state-space models, the transformed Gaussian process state-space models (TGPSSMs). By leveraging a parametric normalizing flow, the TGPSSMs enrich the GP priors of the standard GPSSM, rendering the state-space model more flexible and expressive. Additionally, we present a scalable variational inference algorithm for learning and inference in TGPSSMs, which provides a flexible and optimal structure for the variational distribution of the latent states. The algorithm is interpretable and computationally efficient owing to the sparse GP representation and the bijective nature of the normalizing flow. To further improve learning and inference performance, we integrate a constrained optimization framework into the algorithm to enhance the state-space representation capability and optimize the hyperparameters. Experimental results on various synthetic and real datasets corroborate that the proposed TGPSSM yields superior learning and inference performance compared with several state-of-the-art methods. The accompanying source code is available at \url{https://github.com/zhidilin/TGPSSM}.
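To make the construction concrete, the following is a minimal sketch of the TGPSSM generative model as described above; the notation (transition function $f$, flow $\mathbb{G}_{\boldsymbol{\theta}}$, emission function $g$, noise covariances $\mathbf{Q}$ and $\mathbf{R}$) is assumed for illustration and is not taken verbatim from the paper:
\begin{align*}
  f(\cdot) &\sim \mathcal{GP}\big(m(\cdot),\, k(\cdot,\cdot)\big),
    && \text{standard GP prior on the transition function} \\
  \tilde{f}(\cdot) &= \mathbb{G}_{\boldsymbol{\theta}}\big(f(\cdot)\big),
    && \text{parametric (bijective) normalizing flow} \\
  \mathbf{x}_t &= \tilde{f}(\mathbf{x}_{t-1}) + \mathbf{w}_t,
    \quad \mathbf{w}_t \sim \mathcal{N}(\mathbf{0}, \mathbf{Q}),
    && \text{state transition} \\
  \mathbf{y}_t &= g(\mathbf{x}_t) + \mathbf{v}_t,
    \quad \mathbf{v}_t \sim \mathcal{N}(\mathbf{0}, \mathbf{R}),
    && \text{observation model}
\end{align*}
Because $\mathbb{G}_{\boldsymbol{\theta}}$ is bijective, densities under the transformed prior can be evaluated through the change-of-variables formula, which is what keeps variational inference in the TGPSSM tractable.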