Generative Bayesian Filtering (GBF) provides a powerful and flexible framework for posterior inference in complex nonlinear and non-Gaussian state-space models. Our approach extends Generative Bayesian Computation (GBC) to dynamic settings, enabling recursive posterior inference via simulation-based methods powered by deep neural networks. GBF requires no explicit density evaluations, making it particularly effective when observation or transition distributions are analytically intractable. To address parameter learning, we introduce the Generative-Gibbs sampler, which bypasses explicit density evaluation by iteratively sampling each variable from its implicit full conditional distribution. This technique is broadly applicable and enables inference in hierarchical Bayesian models with intractable densities, including state-space models. We assess the performance of the proposed methodologies through both simulated and empirical studies, including the estimation of $\alpha$-stable stochastic volatility models. Our findings indicate that GBF significantly outperforms existing likelihood-free approaches in accuracy and robustness when dealing with intractable state-space models.