Since its introduction, the partial information decomposition (PID) has emerged as a powerful information-theoretic framework for studying the structure of (potentially higher-order) interactions in complex systems. Despite its utility, the applicability of the PID is restricted by the need to assign elements as either sources or targets, as well as by its reliance on the specific structure of the mutual information itself. Here, we introduce a generalized information decomposition that relaxes the source/target distinction while still satisfying the basic intuitions about information. This approach is based on the decomposition of the Kullback-Leibler divergence, and consequently allows for the analysis of any information gained when updating from an arbitrary prior to an arbitrary posterior. As a result, any information-theoretic measure that can be written as a Kullback-Leibler divergence admits a decomposition in the style of Williams and Beer, including the total correlation, the negentropy, and the mutual information as special cases. In this paper, we explore how the generalized information decomposition can reveal novel insights into existing measures, as well as into the nature of higher-order synergies. We show that synergistic information is intimately related to the well-known Tononi-Sporns-Edelman (TSE) complexity, and that synergistic information requires a balance of integration and segregation similar to that associated with high TSE complexity. We conclude with a discussion of how this approach relates to other attempts to generalize the PID, and of possibilities for empirical application.
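For concreteness, the measures named above can all be written as Kullback-Leibler divergences in the standard way; the notation below is generic and not drawn from the paper itself, with $P_{\max}$ denoting the maximum-entropy reference distribution appropriate to the setting (e.g., the uniform distribution for discrete supports):
\begin{align}
  I(X;Y) &= D_{\mathrm{KL}}\!\left( P(X,Y) \,\middle\|\, P(X)P(Y) \right) \\
  \mathrm{TC}(X_1,\dots,X_N) &= D_{\mathrm{KL}}\!\left( P(X_1,\dots,X_N) \,\middle\|\, \textstyle\prod_{i=1}^{N} P(X_i) \right) \\
  J(X) &= D_{\mathrm{KL}}\!\left( P(X) \,\middle\|\, P_{\max}(X) \right)
\end{align}
In each case the divergence measures the information gained in updating from a "prior" (independence, or maximum entropy) to the true "posterior" distribution, which is the structure the generalized decomposition exploits.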