Numerical simulations in climate, chemistry, and astrophysics are too computationally expensive for uncertainty quantification or parameter exploration at high resolution. Reduced-order and surrogate models are multiple orders of magnitude faster, but traditional surrogates are inflexible or inaccurate, and purely machine-learning (ML)-based surrogates are too data-hungry. We propose a hybrid, flexible surrogate model that exploits known physics to simulate the large-scale dynamics and restricts learning to the hard-to-model term, called the parametrization or closure, which captures the effect of fine-scale dynamics on the large-scale dynamics. Leveraging neural operators, we are the first to learn grid-independent, non-local, and flexible parametrizations. Our \textit{multiscale neural operator} is motivated by a rich literature in multiscale modeling, has quasilinear runtime complexity, is more accurate or flexible than state-of-the-art parametrizations, and is demonstrated on the chaotic multiscale Lorenz96 equation.
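To make the closure-learning setup concrete, the following is a minimal sketch of the standard two-scale Lorenz96 system (Lorenz, 1996) mentioned above. The parameter values (F, h, b, c) are conventional defaults from the Lorenz96 literature, not necessarily those used in the paper; the fine-scale coupling term marked below is the quantity a parametrization, such as the proposed neural operator, would approximate from the large-scale state alone.

```python
import numpy as np

def lorenz96_twoscale_tendencies(X, Y, F=10.0, h=1.0, b=10.0, c=10.0):
    """Two-scale Lorenz96 tendencies.

    X: shape (K,), large-scale variables.
    Y: shape (K, J), fine-scale variables coupled to each X_k.
    Returns (dX/dt, dY/dt). The final term of dX is the fine-scale
    coupling ("closure") that a learned parametrization replaces.
    """
    K, J = Y.shape
    # Large-scale advection, linear damping, forcing, and the coupling
    # term -(h*c/b) * sum_j Y_{j,k} that a surrogate must model.
    dX = (np.roll(X, 1) * (np.roll(X, -1) - np.roll(X, 2))
          - X + F
          - (h * c / b) * Y.sum(axis=1))
    # Fine-scale dynamics; flatten Y so cyclic shifts wrap over all K*J
    # fine-scale variables, as in the original formulation.
    Yf = Y.ravel()
    dY = (c * b * np.roll(Yf, -1) * (np.roll(Yf, 1) - np.roll(Yf, -2))
          - c * Yf
          + (h * c / b) * np.repeat(X, J))
    return dX, dY.reshape(K, J)
```

With Y set to zero, dX reduces to the single-scale Lorenz96 tendency, which is exactly the "known physics" part of the hybrid surrogate; the learned term accounts for the difference.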