This work proposes a general framework for capturing noise-driven transitions in spatially extended non-equilibrium systems and for explaining the emergence of coherent patterns beyond the instability onset. The framework relies on stochastic parameterizations that reduce the complexity of the original equations while capturing the key effects of the unresolved scales, and it applies to both Gaussian and Lévy-type noise. Our parameterizations offer two key advantages. First, they approximate stochastic invariant manifolds when the latter exist. Second, even when such manifolds break down, our formulas can be adapted through a simple optimization of their constitutive parameters. This allows us to handle regimes with weak time-scale separation in which the system has undergone multiple transitions, producing large-amplitude solutions that are not captured by invariant-manifold or other time-scale-separation methods. The optimized stochastic parameterizations capture how small-scale noise impacts the larger scales through the system's nonlinear interactions. This effect is built into the very fabric of our parameterizations, which incorporate non-Markovian coefficients into the reduced equations. These coefficients account for the noise's past influence over a finite memory length that is selected for optimal performance. The specific "memory" function, which determines how this past influence is weighted, depends on the noise's strength and on how the noise interacts with the system's nonlinearities. Remarkably, training our theory-guided reduced models on a single noise path suffices to learn the memory length that is optimal for out-of-sample predictions, including the prediction of rare events. This success stems from our "hybrid" approach, which combines analytical understanding with data-driven learning and thereby avoids a key limitation of purely data-driven methods: their struggle to generalize to unseen scenarios, known as the "extrapolation problem."
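For concreteness, a schematic form of such a reduced equation with memory could read as follows; this is an illustrative sketch rather than the paper's exact formulas, with $F$, $G$, $\kappa$, $\sigma$, and $\tau$ serving as placeholder symbols for the reduced drift, the coupling of the memory term, the memory kernel, the noise amplitude, and the finite memory length, respectively:
\[
  \mathrm{d}X_t \;=\; \Big[\, F(X_t) \;+\; G\big(X_t,\, M_\tau(t)\big) \Big]\,\mathrm{d}t \;+\; \sigma\,\mathrm{d}W_t,
  \qquad
  M_\tau(t) \;=\; \int_{t-\tau}^{t} \kappa(t-s)\,\mathrm{d}W_s,
\]
where $W_t$ denotes the driving noise path. In this picture, $\kappa$ plays the role of the "memory" function weighting the noise's past influence, and the memory length $\tau$ is the parameter learned by optimization against a single noise path.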