Enhancing the sparsity of data-driven reduced-order models (ROMs) has gained increasing attention in recent years. In this work, we analyze an efficient approach to identifying skillful ROMs with a sparse structure using an information-theoretic indicator called causation entropy. The causation entropy statistically quantifies the additional contribution of each candidate term to the underlying dynamics beyond the information already captured by all the other terms in the ansatz. By doing so, it assesses the importance of each term to the dynamics before any parameter estimation is performed. The approach can therefore be used to eliminate terms with little dynamical impact, leading to a parsimonious structure that retains the essential physics. To circumvent the difficulty of estimating the high-dimensional probability density functions (PDFs) involved in computing the causation entropy, we leverage Gaussian approximations of these PDFs, which are shown to be sufficient even in the presence of highly non-Gaussian dynamics. The effectiveness of the approach is illustrated on the Kuramoto-Sivashinsky equation by building sparse causation-based ROMs for various purposes, such as recovering long-term statistics and inferring unobserved dynamics via data assimilation with partial observations.
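To make the Gaussian approximation concrete, the sketch below illustrates (in a generic form, not the authors' implementation) how a causation entropy of one candidate term toward a target variable could be evaluated from data: under the Gaussian approximation, each conditional entropy reduces to a log-determinant of a sample covariance matrix. The function and variable names are hypothetical and chosen only for illustration.

```python
import numpy as np

def gaussian_causation_entropy(z, X, j):
    """Causation entropy C_{x_j -> z | X without x_j} under a Gaussian approximation.

    z : (N,) samples of the target (e.g., the tendency of one resolved mode).
    X : (N, m) samples of the m candidate library terms, m >= 2.
    j : index of the candidate term whose contribution is assessed.

    With Gaussian PDFs,
        C = H(z | X_-j) - H(z | X)
          = 0.5 * [ln det R(z, X_-j) - ln det R(X_-j)
                   - ln det R(z, X)  + ln det R(X)],
    where R(.) denotes the sample covariance of the stacked variables.
    """
    def logdet_cov(*cols):
        data = np.column_stack(cols)
        # Small jitter keeps the sample covariance positive definite.
        cov = np.cov(data, rowvar=False) + 1e-12 * np.eye(data.shape[1])
        return np.linalg.slogdet(cov)[1]

    X_minus_j = np.delete(X, j, axis=1)
    return 0.5 * (logdet_cov(z, X_minus_j) - logdet_cov(X_minus_j)
                  - logdet_cov(z, X) + logdet_cov(X))
```

A small causation entropy for term j suggests it adds little information beyond the remaining terms and may be pruned before parameters are fit; the specific thresholding rule is a modeling choice outside this sketch.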