This work builds on the models and concepts presented in Part I to learn approximate dictionary representations of Koopman operators from data. Part I of this paper presented a methodology for reasoning about the subspace invariance of a Koopman dictionary and demonstrated it on the state-inclusive logistic lifting (SILL) basis: an affine basis augmented with conjunctive logistic functions. The SILL dictionary's nonlinear functions are homogeneous, as is the norm in data-driven dictionary learning of Koopman operators. In this paper, we discover that structured mixing of heterogeneous dictionary functions, drawn from different classes of nonlinear functions, achieves the same accuracy and dimensional scaling as the deep-learning-based deepDMD algorithm. We show this specifically by building a heterogeneous dictionary composed of SILL functions and conjunctive radial basis functions (RBFs). This mixed dictionary matches the accuracy and dimensional scaling of deepDMD with an order-of-magnitude reduction in parameters, while maintaining geometric interpretability. These results strengthen the viability of dictionary-based Koopman models for solving high-dimensional nonlinear learning problems.
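As an illustrative sketch of the heterogeneous dictionary described above (the exact parameterization is developed in the body of the paper; the centers $\mu_{ij}$, $c_{ij}$ and the steepness and width parameters $\alpha_{ij}$, $\beta_{ij}$ below are placeholder symbols, not the paper's notation), a mixed SILL/RBF dictionary on a state $x \in \mathbb{R}^n$ might take the form
\[
\psi(x) = \begin{bmatrix} 1 & x^\top & \Lambda_1(x) & \cdots & \Lambda_{N_L}(x) & \rho_1(x) & \cdots & \rho_{N_R}(x) \end{bmatrix}^\top,
\]
\[
\Lambda_i(x) = \prod_{j=1}^{n} \frac{1}{1 + e^{-\alpha_{ij}(x_j - \mu_{ij})}},
\qquad
\rho_i(x) = \prod_{j=1}^{n} \exp\!\big(-\beta_{ij}(x_j - c_{ij})^2\big),
\]
where the $\Lambda_i$ are conjunctive logistic (SILL) functions, the $\rho_i$ are conjunctive radial basis functions formed from per-coordinate Gaussians, and the affine terms $1$ and $x$ make the dictionary state inclusive.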