We perform approximate inference in state-space models that allow for nonlinear higher-order Markov chains in latent space. The conditional independencies of the generative model enable us to parameterize only an inference model, which learns to estimate clean states in a self-supervised manner using maximum likelihood. First, we propose a recurrent method that is trained directly on noisy observations. We then recast the model so that the optimization problem yields an update scheme that backpropagates through a recursion similar to the classical Kalman filter and smoother. In scientific applications, domain knowledge can provide a linear approximation of the latent transition maps; this knowledge is easily incorporated into our model, leading to a hybrid inference approach. Experiments show that, in contrast to other methods, the hybrid method makes the inferred latent states more physically interpretable and more accurate, especially in low-data regimes. Furthermore, we rely neither on an additional parameterization of the generative model nor on supervision via uncorrupted observations or ground-truth latent states. Despite its simplicity, our model achieves competitive results on the chaotic Lorenz system compared to a fully supervised approach and outperforms a method based on variational inference.
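For reference, the classical Kalman filter and Rauch-Tung-Striebel smoother recursion mentioned above can be sketched as follows. This is a minimal sketch of the textbook linear-Gaussian recursion only, not the paper's inference model; the system matrices `A`, `H`, `Q`, `R` and the 1-D random-walk demo are hypothetical stand-ins for illustration.

```python
# Sketch of the classical Kalman filter (forward pass) and RTS smoother
# (backward pass) for a linear-Gaussian state-space model:
#   x_t = A x_{t-1} + w_t,  w_t ~ N(0, Q)
#   y_t = H x_t + v_t,      v_t ~ N(0, R)
# The concrete matrices and data below are illustrative assumptions.
import numpy as np

def kalman_filter(y, A, H, Q, R, m0, P0):
    """Forward pass: filtered means/covariances and one-step predictions."""
    T, n = len(y), m0.shape[0]
    m_f = np.zeros((T, n)); P_f = np.zeros((T, n, n))
    m_p = np.zeros((T, n)); P_p = np.zeros((T, n, n))
    m, P = m0, P0
    for t in range(T):
        # Predict: propagate mean and covariance through the transition map.
        m_pred = A @ m
        P_pred = A @ P @ A.T + Q
        # Update: correct the prediction with the noisy observation y[t].
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        m = m_pred + K @ (y[t] - H @ m_pred)
        P = (np.eye(n) - K @ H) @ P_pred
        m_f[t], P_f[t], m_p[t], P_p[t] = m, P, m_pred, P_pred
    return m_f, P_f, m_p, P_p

def rts_smoother(A, m_f, P_f, m_p, P_p):
    """Backward (Rauch-Tung-Striebel) pass over the filtered estimates."""
    m_s, P_s = m_f.copy(), P_f.copy()
    for t in range(len(m_f) - 2, -1, -1):
        G = P_f[t] @ A.T @ np.linalg.inv(P_p[t + 1])
        m_s[t] = m_f[t] + G @ (m_s[t + 1] - m_p[t + 1])
        P_s[t] = P_f[t] + G @ (P_s[t + 1] - P_p[t + 1]) @ G.T
    return m_s, P_s

# Demo: denoise a 1-D random walk observed under heavy noise.
rng = np.random.default_rng(0)
A = np.array([[1.0]]); H = np.array([[1.0]])
Q = np.array([[0.01]]); R = np.array([[1.0]])
x = np.cumsum(rng.normal(0.0, 0.1, size=200))          # latent trajectory
y = x[:, None] + rng.normal(0.0, 1.0, size=(200, 1))   # noisy observations
m_f, P_f, m_p, P_p = kalman_filter(y, A, H, Q, R, np.zeros(1), np.eye(1))
m_s, _ = rts_smoother(A, m_f, P_f, m_p, P_p)
obs_mse = np.mean((y[:, 0] - x) ** 2)
smo_mse = np.mean((m_s[:, 0] - x) ** 2)
```

The hybrid approach described above corresponds to supplying such a linear approximation of `A` from domain knowledge, while the learned inference network handles the nonlinear residual structure.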