Digital twins have emerged as a key technology for optimizing the performance of engineering products and systems. High-fidelity numerical simulations constitute the backbone of engineering design, providing accurate insight into the performance of complex systems. However, large-scale, dynamic, non-linear models require significant computational resources and are prohibitive for real-time digital twin applications. To this end, reduced order models (ROMs) are employed to approximate the high-fidelity solutions while accurately capturing the dominant aspects of the physical behavior. The present work proposes a new machine learning (ML) platform for the development of ROMs, to handle large-scale numerical problems dealing with transient nonlinear partial differential equations. Our framework, referred to as $\textit{FastSVD-ML-ROM}$, utilizes $\textit{(i)}$ a singular value decomposition (SVD) update methodology to compute a linear subspace of the multi-fidelity solutions during the simulation process, $\textit{(ii)}$ convolutional autoencoders for nonlinear dimensionality reduction, $\textit{(iii)}$ feed-forward neural networks to map the input parameters to the latent spaces, and $\textit{(iv)}$ long short-term memory networks to predict and forecast the dynamics of the parametric solutions. The efficiency of the $\textit{FastSVD-ML-ROM}$ framework is demonstrated for a 2D linear convection-diffusion equation, the problem of fluid flow around a cylinder, and 3D blood flow inside an arterial segment. The accuracy of the reconstructed results demonstrates the robustness and efficiency of the proposed approach.
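The abstract does not spell out the SVD update in step $\textit{(i)}$. The sketch below illustrates one common way such streaming basis updates are implemented, a Brand-style rank-one incremental SVD in NumPy that folds each new solution snapshot into an existing truncated factorization; the function name, tolerance, and truncation strategy are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def incremental_svd(U, S, V, c, rank, tol=1e-10):
    """One Brand-style incremental SVD update (illustrative sketch).

    Given A ~= U @ np.diag(S) @ V.T, fold a new snapshot column c into
    the factorization without recomputing the SVD of the full snapshot
    matrix, then truncate back to the target rank.
    """
    m = U.T @ c                  # projection of the snapshot on the basis
    p = c - U @ m                # component orthogonal to the current basis
    p_norm = np.linalg.norm(p)

    if p_norm > tol:
        # Snapshot adds a new direction: augment the basis by one column.
        P = (p / p_norm).reshape(-1, 1)
        K = np.block([[np.diag(S), m.reshape(-1, 1)],
                      [np.zeros((1, S.size)), np.array([[p_norm]])]])
        Uk, Sk, Vk_T = np.linalg.svd(K)
        U_new = np.hstack([U, P]) @ Uk
    else:
        # Snapshot already lies in span(U): only the core matrix changes.
        K = np.hstack([np.diag(S), m.reshape(-1, 1)])
        Uk, Sk, Vk_T = np.linalg.svd(K, full_matrices=False)
        U_new = U @ Uk

    V_new = np.block([[V, np.zeros((V.shape[0], 1))],
                      [np.zeros((1, V.shape[1])), np.ones((1, 1))]]) @ Vk_T.T

    r = min(rank, Sk.size)       # truncate back to the target rank
    return U_new[:, :r], Sk[:r], V_new[:, :r]

# Usage: initialize from an initial snapshot block, then stream snapshots.
rng = np.random.default_rng(0)
A0 = rng.standard_normal((500, 20))      # 500 dofs x 20 initial snapshots
U, S, Vt = np.linalg.svd(A0, full_matrices=False)
V = Vt.T
for _ in range(10):                      # new snapshots arriving online
    c = rng.standard_normal(500)
    U, S, V = incremental_svd(U, S, V, c, rank=15)
print(U.shape, S.shape, V.shape)         # (500, 15) (15,) (30, 15)
```

The appeal of such an update in this setting is that the reduced basis is available during the simulation, at a per-snapshot cost governed by the truncation rank rather than by the full snapshot history.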