Transfer entropy is an established method for quantifying directed statistical dependencies in neuroimaging and complex systems datasets. The pairwise (or bivariate) transfer entropy from a source to a target node in a network depends not only on the local source-target link weight, but also on the wider network structure in which the link is embedded. We study this relationship using a discrete-time linearly coupled Gaussian model, which allows us to derive the transfer entropy for each link from the network topology. We show analytically that the dependence on the directed link weight is only a first approximation, valid for weak coupling. More generally, the transfer entropy increases with the in-degree of the source and decreases with the in-degree of the target, indicating an asymmetry of information transfer between hubs and low-degree nodes. In addition, the transfer entropy is directly proportional to weighted motif counts involving common parents or multiple walks from the source to the target, which are more abundant in networks with a high clustering coefficient than in random networks. Our findings also apply to Granger causality, which is equivalent to transfer entropy for Gaussian variables. Moreover, similar empirical results on random Boolean networks suggest that the dependence of the transfer entropy on the in-degree extends to nonlinear dynamics.
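As a minimal numerical sketch of the setting described above (not the authors' code), the snippet below simulates a discrete-time linearly coupled Gaussian model, i.e. a stationary VAR(1) process, and estimates the pairwise transfer entropy via its Granger-causality equivalent for Gaussian variables: half the log-ratio of regression residual variances. The three-node network, the link weight of 0.4, and the history length of 1 are illustrative assumptions; the example only shows that, for the same source-target weight, adding a second parent to the target (raising its in-degree) lowers the measured transfer entropy.

import numpy as np

rng = np.random.default_rng(0)

def simulate_var1(C, n_steps=100_000, noise_std=1.0):
    """Simulate X_t = C @ X_{t-1} + eps_t with i.i.d. Gaussian noise; drop a transient."""
    n = C.shape[0]
    X = np.zeros((n_steps, n))
    for t in range(1, n_steps):
        X[t] = C @ X[t - 1] + noise_std * rng.standard_normal(n)
    return X[1000:]

def residual_variance(y, regressors):
    """Variance of the residuals of an ordinary least-squares fit (zero-mean process)."""
    beta, *_ = np.linalg.lstsq(regressors, y, rcond=None)
    return np.var(y - regressors @ beta)

def transfer_entropy(X, source, target):
    """Pairwise TE (nats), history length 1, using the Gaussian/Granger identity:
    TE = 0.5 * ln( Var(target_t | target_{t-1}) / Var(target_t | target_{t-1}, source_{t-1}) )."""
    y = X[1:, target]
    past_target = X[:-1, [target]]
    past_both = X[:-1, [target, source]]
    return 0.5 * np.log(residual_variance(y, past_target)
                        / residual_variance(y, past_both))

# Toy coupling matrices (rows = targets, columns = sources). The 0 -> 2 link
# weight is identical in both, but node 2 gains an extra parent in the second.
C_sparse = np.zeros((3, 3)); C_sparse[2, 0] = 0.4
C_dense = C_sparse.copy();   C_dense[2, 1] = 0.4   # higher in-degree of the target

for name, C in [("target in-degree 1", C_sparse), ("target in-degree 2", C_dense)]:
    X = simulate_var1(C)
    print(f"{name}: TE(0 -> 2) ~ {transfer_entropy(X, source=0, target=2):.4f} nats")

For this toy model the two values can also be computed analytically (roughly 0.074 vs 0.065 nats), consistent with the claim that the transfer entropy decreases with the in-degree of the target even when the direct link weight is unchanged.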