Quantifying the relations (e.g., similarity) between complex networks paves the way for studying the latent information shared across networks. However, fundamental relation metrics are not well-defined between networks. As a compromise, prevalent techniques measure network relations in a data-driven manner, which is inapplicable to analytic derivations in physics. To resolve this issue, we present a theory for obtaining an optimal characterization of network topological properties. We show that a network can be fully represented by a Gaussian variable defined by a function of the Laplacian, which simultaneously satisfies network-topology-dependent smoothness and maximum entropy properties. Building on this representation, we can analytically measure diverse relations between complex networks. As illustrations, we define encoding (e.g., information divergence and mutual information), decoding (e.g., Fisher information), and causality (e.g., Granger causality and conditional mutual information) between networks. We validate our framework on representative networks (e.g., random networks, protein structures, and chemical compounds) to demonstrate that a series of science and engineering challenges (e.g., network evolution, embedding, and query) can be tackled from a new perspective. An implementation of our theory is released as a multi-platform toolbox.
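To make the idea concrete, the following is a minimal sketch of one such analytic relation measure: representing each network as a zero-mean Gaussian whose covariance is a function of the graph Laplacian, and computing the closed-form KL divergence between two networks' Gaussians. The specific choice of covariance here, the inverse of the Laplacian regularized by a small multiple of the identity, is an illustrative assumption, not necessarily the exact function used in the paper.

```python
import numpy as np

def laplacian(adj):
    """Combinatorial graph Laplacian L = D - A."""
    return np.diag(adj.sum(axis=1)) - adj

def gaussian_cov(adj, eps=1e-3):
    """Covariance of an assumed Gaussian network representation.
    The raw Laplacian is singular, so we regularize with eps*I;
    the paper's exact function of the Laplacian may differ."""
    L = laplacian(adj)
    return np.linalg.inv(L + eps * np.eye(L.shape[0]))

def kl_between_networks(adj_p, adj_q, eps=1e-3):
    """Closed-form KL divergence between the zero-mean Gaussians
    N(0, Sp) and N(0, Sq) induced by two networks of equal size."""
    Sp, Sq = gaussian_cov(adj_p, eps), gaussian_cov(adj_q, eps)
    n = Sp.shape[0]
    _, logdet_p = np.linalg.slogdet(Sp)
    _, logdet_q = np.linalg.slogdet(Sq)
    return 0.5 * (np.trace(np.linalg.inv(Sq) @ Sp) - n
                  + logdet_q - logdet_p)

# Two toy 4-node graphs: a path and a cycle.
path = np.array([[0, 1, 0, 0],
                 [1, 0, 1, 0],
                 [0, 1, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
cycle = np.array([[0, 1, 0, 1],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 1, 0]], dtype=float)

print(kl_between_networks(path, path))   # ~0: identical topologies
print(kl_between_networks(path, cycle))  # > 0: topologies differ
```

Because the divergence is an explicit function of the Laplacians, such quantities can be manipulated analytically rather than estimated from sampled data, which is the gap the abstract identifies.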