Quantifying synergistic mutual information
Quantifying cooperation, or synergy, among random variables in predicting a single target random variable is an important problem in many complex systems. We review three prior information-theoretic measures of synergy and introduce a novel synergy measure defined as the difference between the whole and the union of its parts. We apply all four measures to a suite of binary circuits to demonstrate that our measure alone quantifies the intuitive concept of synergy across all examples. We also show that, under our measure of synergy, independent predictors can have positive redundant information.
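The canonical illustration of synergy is the XOR gate: together the two input bits fully determine the output, yet each input alone is uninformative. A minimal sketch (the `mutual_information` helper and the joint-distribution encoding are illustrative, not the paper's code) computes both quantities; since each part alone carries zero information here, the union of the parts is zero and the whole-minus-union synergy is the full 1 bit:

```python
import math
from itertools import product

def mutual_information(joint):
    """I(X;Y) in bits from a dict mapping (x, y) -> probability."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# XOR gate: X1, X2 are independent uniform bits, Y = X1 ^ X2.
xor_joint = {((x1, x2), x1 ^ x2): 0.25
             for x1, x2 in product([0, 1], repeat=2)}

# The whole: both inputs jointly determine Y -> I((X1,X2);Y) = 1 bit.
i_whole = mutual_information(xor_joint)

# A single part: marginalize out X2; X1 alone says nothing about Y.
part1 = {}
for ((x1, x2), y), p in xor_joint.items():
    part1[(x1, y)] = part1.get((x1, y), 0.0) + p
i_x1 = mutual_information(part1)

print(i_whole)  # 1.0 bit
print(i_x1)     # 0.0 bits -> synergy = whole - union = 1 bit
```

In general the union information of the parts is a more subtle quantity than the sum of the individual mutual informations; XOR is the special case where every part contributes nothing, so the union is trivially zero.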
This paper has not been read by Pith yet.
Forward citations
Cited by 2 Pith papers
-
Quantifying Spacetime Integration across a Partition with Synergy
Synergy-based measures from partial information decomposition are found more suitable than current practice for quantifying integration in simple deterministic networks for the Information Integration Theory of Consciousness.