Graph sparsification is an area of interest in computer science and applied mathematics. Sparsification of a graph, in general, aims to reduce the number of edges in the network while preserving specific properties of the graph, such as cuts and subgraph counts. Computing the sparsest cuts of a graph is known to be NP-hard, and sparsification routines exist for generating linear-sized sparsifiers in almost-quadratic running time $O(n^{2 + \epsilon})$. Consequently, obtaining a sparsifier can be a computationally demanding task, and the complexity varies with the level of sparsity required. In this study, we extend the concept of sparsification to the realm of reaction-diffusion complex systems. We aim to address the challenge of reducing the number of edges in the network while preserving the underlying flow dynamics. To make this problem tractable, we adopt a relaxed approach that considers only a subset of trajectories. We map the network sparsification problem to a data assimilation problem on a Reduced Order Model (ROM) space, with constraints targeted at preserving the eigenmodes of the Laplacian matrix under perturbations. The Laplacian matrix ($L = D - A$) is the difference between the diagonal matrix of degrees ($D$) and the graph's adjacency matrix ($A$). For computational feasibility, we propose approximations to the eigenvalues and eigenvectors of the Laplacian matrix under perturbations, and we include a custom function based on these approximations as a constraint in the data assimilation framework. We also demonstrate the extension of our framework to achieving sparsity in parameter sets for Neural Ordinary Differential Equations (neural ODEs).
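The abstract defines the Laplacian $L = D - A$ and refers to approximating its eigenmodes under perturbation, without specifying the approximation. The sketch below is a minimal, hypothetical Python/NumPy illustration of the objects involved: it builds $L$, computes its eigenmodes, and uses standard first-order perturbation theory as a stand-in estimate for how the eigenvalues shift when an edge is removed. The example adjacency matrix and the choice of first-order perturbation theory are assumptions for illustration, not the authors' method.

```python
# Minimal sketch (not the paper's implementation): build L = D - A and
# estimate eigenvalue shifts under an edge-removal perturbation using
# standard first-order perturbation theory.
import numpy as np

def laplacian(A):
    """Graph Laplacian L = D - A for a symmetric adjacency matrix A."""
    D = np.diag(A.sum(axis=1))
    return D - A

# Example: a small undirected graph (adjacency matrix chosen arbitrarily).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
L = laplacian(A)

# Exact eigenmodes of the unperturbed Laplacian.
eigvals, eigvecs = np.linalg.eigh(L)

# Perturbation: remove the edge (1, 2).
A_pert = A.copy()
A_pert[1, 2] = A_pert[2, 1] = 0.0
dL = laplacian(A_pert) - L

# First-order estimates lambda_i' ~= lambda_i + v_i^T dL v_i,
# compared against the exact eigenvalues of the perturbed Laplacian.
approx = eigvals + np.diag(eigvecs.T @ dL @ eigvecs)
exact = np.linalg.eigvalsh(laplacian(A_pert))
print("first-order:", np.round(np.sort(approx), 3))
print("exact:      ", np.round(exact, 3))
```

In the paper's setting, an approximation of this kind would serve as an inexpensive surrogate for the perturbed spectrum, so that a constraint penalizing eigenmode distortion can be evaluated inside the data assimilation loop without repeatedly re-diagonalizing the perturbed Laplacian.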