Nonlinear principal component analysis (nlPCA) via autoencoders has attracted attention in the dynamical systems community due to its higher compression rate when compared to linear principal component analysis (PCA). These model reduction methods experience an increase in the dimensionality of the latent space when applied to datasets that exhibit globally invariant samples due to the presence of symmetries. In this study, we introduce a novel machine learning embedding in the autoencoder, which uses spatial transformer networks and Siamese networks to account for continuous and discrete symmetries, respectively. The spatial transformer network discovers the optimal shift for the continuous translation or rotation so that invariant samples are aligned in the periodic directions. Similarly, the Siamese networks collapse samples that are invariant under discrete shifts and reflections. Thus, the proposed symmetry-aware autoencoder is invariant to predetermined input transformations dictating the dynamics of the underlying physical system. This embedding can be employed with both linear and nonlinear reduction methods, which we term symmetry-aware PCA (s-PCA) and symmetry-aware nlPCA (s-nlPCA). We apply the proposed framework to three fluid flow problems: Burgers' equation, the simulation of the flow through a step diffuser, and the Kolmogorov flow, to showcase its capabilities for cases exhibiting only continuous symmetries, only discrete symmetries, or a combination of both.
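To make the embedding described above concrete, the following is a minimal PyTorch sketch of the idea, not the authors' implementation: a shift-prediction branch (playing the role of the spatial transformer network) aligns each sample along a periodic direction before encoding, and a Siamese-style loss collapses copies related by predetermined discrete operations. All names (`ShiftNet`, `SymmetryAwareAE`, `siamese_loss`), layer sizes, and the choice of discrete operations are illustrative assumptions.

```python
# Illustrative sketch only: the symmetry-aware autoencoder idea with a
# shift-alignment branch (continuous symmetry) and a Siamese-style loss
# over discrete symmetry operations. Architecture details are assumptions.
import torch
import torch.nn as nn


class ShiftNet(nn.Module):
    """Spatial-transformer-like branch: predicts a continuous shift per sample
    so that translated copies of the same state can be aligned."""
    def __init__(self, n):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, u):
        return self.net(u)  # shift in grid units, shape (batch, 1)


def fourier_shift(u, s):
    """Shift a periodic 1D signal u (batch, n) by a continuous amount s
    (batch, 1) using the Fourier shift theorem."""
    n = u.shape[-1]
    k = torch.fft.fftfreq(n, d=1.0 / n, device=u.device)   # integer wavenumbers
    angle = -2.0 * torch.pi * k * s / n                     # broadcasts to (batch, n)
    phase = torch.complex(torch.cos(angle), torch.sin(angle))
    return torch.fft.ifft(torch.fft.fft(u, dim=-1) * phase, dim=-1).real


class SymmetryAwareAE(nn.Module):
    def __init__(self, n, latent_dim=2):
        super().__init__()
        self.shift_net = ShiftNet(n)
        self.encoder = nn.Sequential(nn.Linear(n, 128), nn.ReLU(), nn.Linear(128, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, n))

    def forward(self, u):
        s = self.shift_net(u)
        u_aligned = fourier_shift(u, s)        # continuous symmetry: align samples
        u_hat = self.decoder(self.encoder(u_aligned))
        return fourier_shift(u_hat, -s)        # shift back to the original frame


def siamese_loss(model, u, discrete_ops):
    """Siamese-style handling of discrete symmetries: evaluate the reconstruction
    for every predetermined discrete operation (e.g. a reflection) and keep the
    best one, so all group copies map to the same latent representation."""
    losses = [((model(g(u)) - g(u)) ** 2).mean(dim=-1) for g in discrete_ops]
    return torch.stack(losses, dim=0).min(dim=0).values.mean()


# Example usage with identity and reflection as the assumed discrete group:
# ops = [lambda u: u, lambda u: torch.flip(u, dims=[-1])]
# loss = siamese_loss(SymmetryAwareAE(n=256), torch.randn(8, 256), ops)
```

Replacing the nonlinear encoder/decoder with single linear layers would give the s-PCA variant mentioned above, while the nonlinear version corresponds to s-nlPCA.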