pith. machine review for the scientific record.

arxiv: 1004.2515 · v1 · submitted 2010-04-14 · 💻 cs.IT · math-ph · math.IT · math.MP · physics.bio-ph · physics.data-an · q-bio.NC · q-bio.QM

Recognition: unknown

Nonnegative Decomposition of Multivariate Information

Authors on Pith: no claims yet
classification 💻 cs.IT · math-ph · math.IT · math.MP · physics.bio-ph · physics.data-an · q-bio.NC · q-bio.QM
keywords information · redundancy · interaction · multivariate · sources · atoms · decomposition · definition
0 comments
read the original abstract

Of the various attempts to generalize information theory to multiple variables, the most widely utilized, interaction information, suffers from the problem that it is sometimes negative. Here we reconsider from first principles the general structure of the information that a set of sources provides about a given variable. We begin with a new definition of redundancy as the minimum information that any source provides about each possible outcome of the variable, averaged over all possible outcomes. We then show how this measure of redundancy induces a lattice over sets of sources that clarifies the general structure of multivariate information. Finally, we use this redundancy lattice to propose a definition of partial information atoms that exhaustively decompose the Shannon information in a multivariate system in terms of the redundancy between synergies of subsets of the sources. Unlike interaction information, the atoms of our partial information decomposition are never negative and always support a clear interpretation as informational quantities. Our analysis also demonstrates how the negativity of interaction information can be explained by its confounding of redundancy and synergy.
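The abstract's central definition — redundancy as the minimum information any source provides about each outcome of the target, averaged over outcomes — can be sketched for a small discrete system. The code below is an illustrative implementation under the usual reading of that definition (specific information followed by a pointwise minimum); the function names and the XOR toy distribution are my own, not the paper's.

```python
import itertools
import math

# Toy joint distribution p(s, a1, a2) with S = A1 XOR A2 and A1, A2 uniform.
# XOR is the classic synergy example: neither source alone tells us anything
# about S, but together they determine it completely.
p = {}
for a1, a2 in itertools.product([0, 1], repeat=2):
    p[(a1 ^ a2, a1, a2)] = 0.25

def marginal(joint, idxs):
    """Marginalize the joint distribution onto the given index positions."""
    out = {}
    for outcome, prob in joint.items():
        key = tuple(outcome[i] for i in idxs)
        out[key] = out.get(key, 0.0) + prob
    return out

def specific_info(joint, s, src_idxs):
    """Specific information I(S=s; A): how much source A (a list of
    variable index positions) provides about the particular outcome S = s."""
    ps = marginal(joint, [0])[(s,)]
    pa = marginal(joint, src_idxs)
    psa = marginal(joint, [0] + src_idxs)
    total = 0.0
    for a_key, pa_val in pa.items():
        p_joint = psa.get((s,) + a_key, 0.0)
        if p_joint == 0.0:
            continue
        # p(a|s) * log2( p(s|a) / p(s) )
        total += (p_joint / ps) * math.log2((p_joint / pa_val) / ps)
    return total

def i_min(joint, sources):
    """Redundancy I_min: the minimum specific information that any source
    provides about each outcome of S, averaged over outcomes of S."""
    return sum(prob * min(specific_info(joint, s, src) for src in sources)
               for (s,), prob in marginal(joint, [0]).items())
```

For this XOR distribution, `i_min(p, [[1], [2]])` is 0 bits (the individual sources share no redundant information about S), while `i_min(p, [[1, 2]])` recovers the full 1 bit of I(S; A1, A2): the information is purely synergistic, and no term in the decomposition goes negative.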

This paper has not been read by Pith yet.

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read Pith papers without signing in.

Forward citations

Cited by 11 Pith papers

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Closed-Form Gaussian Estimators for Multi-Source Partial Information Decomposition

    cs.IT 2026-05 unverdicted novelty 7.0

    Closed-form log-determinant expressions provide the first covariance-based estimators for multi-source PID quantities including redundancy, unique information, and synergy in Gaussian variables.

  2. Task Relevance Is Not Local Replaceability: A Two-Axis View of Channel Information

    cs.CV 2026-05 unverdicted novelty 7.0

    Channel importance splits into task relevance and local replaceability; local-axis metrics predict safe removal under pruning better than target-axis metrics across multiple CNNs and datasets.

  3. Quantifying Spacetime Integration across a Partition with Synergy

    cs.IT 2026-04 unverdicted novelty 7.0

    Synergy-based measures from partial information decomposition are found more suitable than current practice for quantifying integration in simple deterministic networks for the Information Integration Theory of Consciousness.

  4. Quantifying Spacetime Integration across a Partition with Synergy

    cs.IT 2026-04 unverdicted novelty 7.0

    Synergy-based measures of spacetime integration outperform current IIT practice when tested on simple deterministic networks.

  5. Structural Impossibility of Antichain-Lattice Partial Information Decomposition

    cs.IT 2026-04 unverdicted novelty 7.0

    Antichain-lattice indexing in PID is structurally insufficient to recover mutual information from information atoms for multivariate cases.

  6. Emergence of information interference in stochastic systems with non-diagonal noise and switching environments

    cond-mat.stat-mech 2026-05 unverdicted novelty 6.0

    In stochastic systems with non-diagonal noise and switching environments, mutual information includes irreducible static and dynamic interference terms that prevent simple decomposition.

  7. Partial Effective Information Decomposition for Synergistic Causality

    stat.ML 2026-05 unverdicted novelty 6.0

    PEID decomposes the causal effect of multiple sources on a target under maximum-entropy interventions into unique and synergistic information, enabling hyperedge causal graphs and downward causation analysis.

  8. Heterophily as a generative mechanism for self-organized synergistic interdependencies

    physics.soc-ph 2026-04 unverdicted novelty 6.0

    Heterophily weakens pairwise couplings while inducing geometric constraints that create synergistic higher-order interdependencies in a co-evolving spin-glass model.

  9. More Is Different: Toward a Theory of Emergence in AI-Native Software Ecosystems

    cs.SE 2026-04 unverdicted novelty 5.0

AI-native software ecosystems exhibit emergent behaviors best explained by complex adaptive systems theory; the paper calls for ecosystem-level monitoring and offers seven testable propositions that may extend or replace Lehman's laws.

  10. ConceptTracer: Interactive Analysis of Concept Saliency and Selectivity in Neural Representations

    cs.LG 2026-04 unverdicted novelty 5.0

    ConceptTracer supplies an interactive interface and saliency/selectivity metrics to locate concept-responsive neurons in neural representations, shown on TabPFN.

  11. PrismNet: Viewing Time Series Through a Multi-Modal Prism for Interpretable Power Load Forecasting

    eess.SP 2026-05 unverdicted novelty 4.0

    PrismNet combines text and image modalities with time series via a PID-guided contrastive learning module to boost few-shot power load forecasting accuracy and provide interpretability.
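The first forward citation above describes covariance-based, log-determinant estimators for Gaussian PID quantities. Its closed-form expressions are not reproduced here, but the standard identity such estimators build on — mutual information between jointly Gaussian blocks as a difference of log-determinants — can be sketched as follows (the function name and example covariance are illustrative, not from that paper):

```python
import math
import numpy as np

def gaussian_mi(cov, idx_x, idx_y):
    """Mutual information (in nats) between jointly Gaussian blocks X and Y,
    via the log-determinant identity
        I(X; Y) = 0.5 * (log det S_X + log det S_Y - log det S_XY),
    where each S_* is the corresponding sub-block of the covariance matrix."""
    cov = np.asarray(cov, dtype=float)
    sub = lambda idx: cov[np.ix_(idx, idx)]
    _, ld_x = np.linalg.slogdet(sub(idx_x))
    _, ld_y = np.linalg.slogdet(sub(idx_y))
    _, ld_xy = np.linalg.slogdet(sub(idx_x + idx_y))
    return 0.5 * (ld_x + ld_y - ld_xy)

# Bivariate Gaussian with correlation 0.5: I = -0.5 * ln(1 - 0.5**2) nats.
cov = [[1.0, 0.5],
       [0.5, 1.0]]
mi = gaussian_mi(cov, [0], [1])
```

Using `slogdet` rather than `det` keeps the computation numerically stable for larger, near-singular covariance blocks.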