pith. machine review for the scientific record.

arxiv: 2604.18635 · v3 · submitted 2026-04-18 · 💻 cs.IT · math.IT

Recognition: 2 Lean theorem links

Quantifying Spacetime Integration across a Partition with Synergy

Authors on Pith: no claims yet

Pith reviewed 2026-05-12 02:19 UTC · model grok-4.3

classification 💻 cs.IT math.IT
keywords partial information decomposition · information integration · IIT · consciousness · synergy · dynamical systems · complexity measures · spacetime integration

The pith

Synergy-based measures from partial information decomposition better quantify spacetime integration for IIT than current practice.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper introduces four integration measures built on the partial information decomposition framework to strengthen the mathematical basis of the Information Integration Theory of Consciousness. These measures are compared against existing IIT methods using simple deterministic networks, with the finding that synergy-focused variants align more closely with IIT goals. A sympathetic reader would care because improved quantification of integration could sharpen tests of whether a system is conscious and offer general tools for measuring complexity in discrete dynamical systems. The work stays grounded in direct comparisons rather than broad claims about all networks.

Core claim

The paper establishes that four measures of integration derived from partial information decomposition, particularly those centered on synergy, provide a more suitable quantification of spacetime integration across partitions than the measures currently employed in IIT, as shown by explicit comparisons in simple deterministic networks. This holds because synergy captures the non-redundant, integrated contributions that IIT seeks to quantify, while current practice does not isolate these as effectively in the tested cases.

What carries the argument

Partial information decomposition framework, which decomposes joint information into unique, redundant, and synergistic atoms to isolate synergy as the basis for integration measures.
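
The decomposition can be made concrete with the classic XOR example, in which all of the target information is synergistic: neither input alone predicts the output, yet together they determine it. The sketch below uses the original Williams–Beer redundancy measure (I_min) purely for illustration; the paper's four measures are built on PID but are not necessarily this particular redundancy function.

```python
import math
from collections import defaultdict
from itertools import product

def xor_dist():
    # Joint distribution p(x1, x2, y) for Y = X1 XOR X2 with uniform inputs.
    return {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product([0, 1], repeat=2)}

def marginal(p, idxs):
    out = defaultdict(float)
    for k, v in p.items():
        out[tuple(k[i] for i in idxs)] += v
    return out

def mi(p, srcs, tgt):
    # Mutual information I(Y; X_srcs) in bits.
    ps, pt = marginal(p, srcs), marginal(p, [tgt])
    pj = marginal(p, srcs + [tgt])
    return sum(v * math.log2(v / (ps[k[:-1]] * pt[k[-1:]]))
               for k, v in pj.items() if v > 0)

def i_min(p, tgt=2):
    # Williams-Beer redundancy: expected minimum specific information
    # over the two single-source predictors.
    red = 0.0
    for (y,), py in marginal(p, [tgt]).items():
        specs = []
        for i in (0, 1):
            pj, pi_ = marginal(p, [i, tgt]), marginal(p, [i])
            specs.append(sum((pj[(x, y)] / py)
                             * math.log2(pj[(x, y)] / (pi_[(x,)] * py))
                             for (x,) in pi_ if pj[(x, y)] > 0))
        red += py * min(specs)
    return red

p = xor_dist()
red = i_min(p)
unq1 = mi(p, [0], 2) - red
unq2 = mi(p, [1], 2) - red
syn = mi(p, [0, 1], 2) - unq1 - unq2 - red  # for XOR the full 1 bit is synergy
```

Here redundancy and both unique atoms come out to 0 bits and synergy to 1 bit, which is exactly the kind of irreducibly joint contribution an integration measure built on synergy is meant to isolate.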

If this is right

  • Synergy-based measures are more suitable for IIT's use-case than current practice.
  • These measures serve as useful complexity metrics for discrete dynamical systems beyond IIT.
  • Integration is quantified by isolating synergistic information contributions across space and time.
  • The four new measures provide concrete alternatives that can be computed directly from system states.
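
As a baseline for what "computed directly from system states" can mean, here is a minimal whole-minus-sum integration sketch for two-node deterministic networks under a uniform input distribution. This is a classical φ-like baseline shown for orientation only, not one of the paper's four PID-based measures.

```python
import math
from collections import defaultdict
from itertools import product

def integration(update, n=2):
    # Whole-minus-sum integration across the finest partition {0}, {1}, ...
    # for a deterministic update rule on binary node states, uniform prior.
    states = list(product([0, 1], repeat=n))
    p = defaultdict(float)
    for s in states:
        p[(s, update(s))] += 1.0 / len(states)

    def mi(f, g):
        # Mutual information (bits) between f(past state) and g(next state).
        pj, pf, pg = defaultdict(float), defaultdict(float), defaultdict(float)
        for (s, t), v in p.items():
            pj[(f(s), g(t))] += v
            pf[f(s)] += v
            pg[g(t)] += v
        return sum(v * math.log2(v / (pf[a] * pg[b]))
                   for (a, b), v in pj.items() if v > 0)

    whole = mi(lambda s: s, lambda t: t)
    parts = sum(mi(lambda s, i=i: s[i], lambda t, i=i: t[i]) for i in range(n))
    return whole - parts

copy = lambda s: s              # each node copies itself: no cross-partition flow
swap = lambda s: (s[1], s[0])   # each node copies the other node
```

A disconnected copy network scores 0 bits, while a swap network, whose parts carry no information about their own futures, scores the full 2 bits; the paper's comparisons probe where such whole-minus-sum intuitions and synergy-based accounts diverge.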

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • If the preference for synergy holds more generally, these measures could be used to compare integration levels across different classes of dynamical systems in neuroscience.
  • The framework might allow direct tests of whether adding synergistic structure increases a system's integration score in a predictable way.
  • Outside consciousness research, the measures offer a way to track how integration changes under system perturbations without relying on IIT-specific assumptions.

Load-bearing premise

That the advantage of synergy-based measures seen in simple deterministic networks will extend to the broader, more varied networks and use cases relevant to IIT.

What would settle it

Applying the synergy-based measures and current IIT measures side-by-side to a larger stochastic or biological network and finding that current measures better match IIT's intended integration values.

Figures

Figures reproduced from arXiv: 2604.18635 by Virgil Griffith.

Figure 1
Figure 1. IIT4's ⟨φ_c^2023⟩ compared to ⟨φ_c^S1⟩ in triplet networks, with the min and max φ_c^2023 values given to show why φ_c^2023 drops so precipitously. In our view, a more intuitive complexity measure would increase left to right, which ⟨φ_c^S1⟩ does, yet ⟨φ_c^2023⟩ does the opposite. To avoid claims of sleight-of-hand, φ_c is computed using a normalized MIP and φ_s^S1 is …
Figure 2
Figure 2. Eight doublet networks with transition tables.
Figure 3
Figure 3. State-dependent φ_c^2023 and ⟨φ_c^2023⟩ tell the same story: the φ_c^2023 value of GET4 trounces the φ_c^2023 of the other three networks. A more intuitive complexity measure would instead increase left to right, which ⟨φ_c^S1⟩ does. To avoid claims of sleight-of-hand, φ_c^2023 is computed using a normalized MIP and ⟨φ_s^S1⟩ is computed using an unnormalized MIP. We are seeing φ_c^2023 at its best, and φ …
read the original abstract

In service to the mathematical underpinnings of the Information Integration Theory of Consciousness (IIT), we introduce four measures of integration based on the partial information decomposition framework. We compare our measures to current IIT practice in simple deterministic networks. We find synergy-based measures more suitable for IIT's use-case than current practice. Outside IIT, these measures could also be useful as measures of complexity for discrete dynamical systems.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

2 major / 2 minor

Summary. The manuscript introduces four measures of integration derived from the partial information decomposition (PID) framework, motivated by the needs of the Information Integration Theory of Consciousness (IIT). It performs comparisons against existing IIT integration measures exclusively in small, fully deterministic networks and concludes that the synergy-based PID measures are more suitable for IIT's use-case; it also suggests the measures may serve as complexity quantifiers for discrete dynamical systems.

Significance. If the reported advantage of the synergy measures survives beyond the tested regime, the work would supply a principled alternative to current IIT integration quantifiers and a new tool for analyzing integration/complexity in discrete systems. The explicit grounding in PID is a methodological strength, as is the focus on spacetime partitions.

major comments (2)
  1. [Abstract and Methods] The suitability claim for IIT is drawn directly from comparisons performed only in small, fully deterministic networks. IIT's core applications routinely involve recurrent, stochastic, or larger-scale systems; the manuscript contains no scaling analysis, analytic argument, or additional experiments showing that the observed advantage persists when determinism is relaxed or network size increases. This extrapolation is load-bearing for the central conclusion.
  2. [Results] Without reported details on the precise network topologies, state-space sizes, or quantitative effect sizes (e.g., how much better the synergy measures capture integration relative to baselines), it is impossible to judge whether the advantage is robust or merely an artifact of the deterministic setting.
minor comments (2)
  1. [Methods] The four PID-based measures are introduced without explicit equations or algorithmic pseudocode in the main text; placing these in an appendix or dedicated subsection would improve reproducibility.
  2. [Notation] Notation for the synergy atoms and the spacetime partition is introduced without a consolidated table of symbols, making cross-references between sections harder to follow.

Simulated Author's Rebuttal

2 responses · 0 unresolved

We thank the referee for the constructive feedback. We address each major comment below, agreeing where revisions are needed to strengthen the manuscript while defending the scope and contributions of the current work.

read point-by-point responses
  1. Referee: [Abstract and Methods] The suitability claim for IIT is drawn directly from comparisons performed only in small, fully deterministic networks. IIT's core applications routinely involve recurrent, stochastic, or larger-scale systems; the manuscript contains no scaling analysis, analytic argument, or additional experiments showing that the observed advantage persists when determinism is relaxed or network size increases. This extrapolation is load-bearing for the central conclusion.

    Authors: We agree that the comparisons are limited to small, fully deterministic networks, as stated in the abstract. The manuscript's central finding is that synergy-based measures outperform existing IIT quantifiers within this controlled regime; we do not assert that the advantage necessarily holds universally. We will revise the abstract, introduction, and discussion to include explicit caveats emphasizing that further validation in stochastic, recurrent, and larger networks is required before broader claims can be made. No scaling analysis or analytic generalization argument is included because the paper focuses on introducing the PID-derived measures and demonstrating their behavior in the simplest cases where ground-truth integration can be unambiguously defined. revision: partial

  2. Referee: [Results] Without reported details on the precise network topologies, state-space sizes, or quantitative effect sizes (e.g., how much better the synergy measures capture integration relative to baselines), it is impossible to judge whether the advantage is robust or merely an artifact of the deterministic setting.

    Authors: We accept that additional quantitative detail would improve interpretability. The manuscript describes the networks as small deterministic systems (e.g., chains, cycles, and fully connected topologies with binary node states), but we will expand the results section and add a supplementary table specifying exact topologies, node counts (3–5 nodes, yielding state spaces of size 8–32), and numerical effect sizes, including the magnitude of improvement of each synergy measure over the IIT baseline for every tested network. revision: yes

Circularity Check

0 steps flagged

No circularity; new measures defined from PID and evaluated empirically

full rationale

The paper introduces four PID-based synergy measures of integration, defines them mathematically, and compares their behavior to existing IIT practice via direct computation on small deterministic networks. No step equates a claimed prediction or result to its own inputs by construction, no parameter is fitted on a subset and then renamed as a prediction, and no load-bearing premise reduces to a self-citation whose content is unverified. The suitability conclusion is presented as an empirical observation within the tested regime rather than a deductive necessity, leaving the derivation self-contained.

Axiom & Free-Parameter Ledger

0 free parameters · 0 axioms · 0 invented entities

Abstract only; no free parameters, axioms, or invented entities are mentioned or extractable.

pith-pipeline@v0.9.0 · 5340 in / 994 out tokens · 34717 ms · 2026-05-12T02:19:23.911882+00:00 · methodology

discussion (0)


Lean theorems connected to this paper

Citations machine-checked in the Pith Canon. Every link opens the source theorem in the public Lean library.

What do these tags mean?
matches
The paper's claim is directly supported by a theorem in the formal canon.
supports
The theorem supports part of the paper's argument, but the paper may add assumptions or extra steps.
extends
The paper goes beyond the formal theorem; the theorem is a base layer rather than the whole result.
uses
The paper appears to rely on the theorem as machinery.
contradicts
The paper's claim conflicts with a theorem or certificate in the canon.
unclear
Pith found a possible connection, but the passage is too broad, indirect, or ambiguous to say the theorem truly supports the claim.

Reference graph

Works this paper leans on

32 extracted references · 32 canonical work pages · 1 internal anchor

  1. [1]

    Consciousness and Complexity

    Giulio Tononi and Gerald M. Edelman. “Consciousness and Complexity”. In: Science 282.5395 (1998), pp. 1846–1851. DOI: 10.1126/science.282.5395.1846. URL: https://www.science.org/doi/abs/10.1126/science.282.5395.1846

  2. [2]

    An information integration theory of consciousness

    Giulio Tononi. “An information integration theory of consciousness”. In: BMC Neuroscience 5 (2004), p. 42. URL: https://api.semanticscholar.org/CorpusID:6987007

  3. [3]

    Integrated Information in Discrete Dynamical Systems: Motivation and Theoretical Framework

    David Balduzzi and Giulio Tononi. “Integrated Information in Discrete Dynamical Systems: Motivation and Theoretical Framework”. In: PLOS Computational Biology 4.6 (June 2008), pp. 1–18. DOI: 10.1371/journal.pcbi.1000091. URL: https://doi.org/10.1371/journal.pcbi.1000091

  4. [4]

    From the Phenomenology to the Mechanisms of Consciousness: Integrated Information Theory 3.0

    Masafumi Oizumi, Larissa Albantakis, and Giulio Tononi. “From the Phenomenology to the Mechanisms of Consciousness: Integrated Information Theory 3.0”. In: PLOS Computational Biology 10.5 (May 2014), pp. 1–25. DOI: 10.1371/journal.pcbi.1003588. URL: https://doi.org/10.1371/journal.pcbi.1003588

  5. [5]

    Integrated information theory (IIT) 4.0: Formulating the properties of phenomenal existence in physical terms

    Larissa Albantakis et al. “Integrated information theory (IIT) 4.0: Formulating the properties of phenomenal existence in physical terms”. In: PLOS Computational Biology 19.10 (Oct. 2023), pp. 1–45. DOI: 10.1371/journal.pcbi.1011465. URL: https://doi.org/10.1371/journal.pcbi.1011465

  6. [6]

    On the non-uniqueness problem in integrated information theory

    Jake R Hanson and Sara I Walker. “On the non-uniqueness problem in integrated information theory”. In: Neuroscience of Consciousness 2023.1 (June 2023), niad014. ISSN: 2057-2107. DOI: 10.1093/nc/niad014. URL: https://doi.org/10.1093/nc/niad014

  7. [7]

    The strength of weak integrated information theory

    Pedro A.M. Mediano et al. “The strength of weak integrated information theory”. In: Trends in Cognitive Sciences 26.8 (2022), pp. 646–655. ISSN: 1364-6613. DOI: 10.1016/j.tics.2022.04.008. URL: https://www.sciencedirect.com/science/article/pii/S1364661322000924

  8. [8]

    Complexity as Causal Information Integration

    Carlotta Langer and Nihat Ay. “Complexity as Causal Information Integration”. In: Entropy 22.10 (2020). ISSN: 1099-4300. DOI: 10.3390/e22101107. URL: https://www.mdpi.com/1099-4300/22/10/1107

  9. [9]

    Nonnegative Decomposition of Multivariate Information

    Paul L. Williams and Randall D. Beer. “Nonnegative Decomposition of Multivariate Information”. In: CoRR abs/1004.2515 (2010). arXiv: 1004.2515. URL: http://arxiv.org/abs/1004.2515

  10. [10]

    Quantifying synergistic mutual information

    Virgil Griffith and Christof Koch. Quantifying synergistic mutual information. 2014. arXiv: 1205.4265 [cs.IT]. URL: https://arxiv.org/abs/1205.4265

  11. [11]

    Morphological Computation: Synergy of Body and Brain

    Keyan Ghazi-Zahedi, Carlotta Langer, and Nihat Ay. “Morphological Computation: Synergy of Body and Brain”. In: Entropy 19.9 (2017). ISSN: 1099-4300. DOI: 10.3390/e19090456. URL: https://www.mdpi.com/1099-4300/19/9/456

  12. [12]

    A measure for brain complexity: relating functional segregation and integration in the nervous system

    Giulio Tononi, Olaf Sporns, and Gerald M Edelman. “A measure for brain complexity: relating functional segregation and integration in the nervous system”. In: Proceedings of the National Academy of Sciences 91.11 (1994), pp. 5033–5037. DOI: 10.1073/pnas.91.11.5033. URL: https://www.pnas.org/doi/10.1073/pnas.91.11.5033

  13. [13]

    Intrinsic cause-effect power: the tradeoff between differentiation and specification

    William G. P. Mayner, William Marshall, and Giulio Tononi. Intrinsic cause-effect power: the tradeoff between differentiation and specification. 2025. arXiv: 2510.03881 [q-bio.NC]. URL: https://arxiv.org/abs/2510.03881

  14. [14]

    A measure for intrinsic information

    Leonardo Barbosa et al. “A measure for intrinsic information”. In: Scientific Reports 10 (Nov. 2020). DOI: 10.1038/s41598-020-75943-4

  15. [15]

    Category theory for scientists (Old version)

    David I. Spivak. Category theory for scientists (Old version). 2013. arXiv: 1302.6946 [math.CT]. URL: https://arxiv.org/abs/1302.6946

  16. [16]

    A Novel Approach to the Partial Information Decomposition

    Artemy Kolchinsky. “A Novel Approach to the Partial Information Decomposition”. In: Entropy 24.3 (Mar. 2022), p. 403. ISSN: 1099-4300. DOI: 10.3390/e24030403. URL: http://dx.doi.org/10.3390/e24030403

  17. [17]

    An operational information decomposition via synergistic disclosure

    Fernando Rosas et al. “An operational information decomposition via synergistic disclosure”. In: CoRR abs/2001.10387 (2020). arXiv: 2001.10387. URL: https://arxiv.org/abs/2001.10387

  18. [18]

    Partial Information Decomposition via Deficiency for Multivariate Gaussians

    Gabriel Schamberg and Praveen Venkatesh. “Partial Information Decomposition via Deficiency for Multivariate Gaussians”. In: CoRR abs/2105.00769 (2021). arXiv: 2105.00769. URL: https://arxiv.org/abs/2105.00769

  19. [19]

    BROJA-2PID: A robust estimator for Bertschinger et al.’s bivariate partial information decomposition

    Abdullah Makkeh, Dirk Oliver Theis, and Raul Vicente. “BROJA-2PID: A robust estimator for Bertschinger et al.’s bivariate partial information decomposition”. In: Entropy 20.4 (2018), p. 271

  20. [20]

    MAXENT3D_PID: An Estimator for the Maximum-Entropy Trivariate Partial Information Decomposition

    Abdullah Makkeh et al. “MAXENT3D_PID: An Estimator for the Maximum-Entropy Trivariate Partial Information Decomposition”. In: Entropy 21.9 (2019). ISSN: 1099-4300. DOI: 10.3390/e21090862. URL: https://www.mdpi.com/1099-4300/21/9/862

  21. [21]

    A measure of statistical complexity based on predictive information

    Samer A. Abdallah and Mark D. Plumbley. A measure of statistical complexity based on predictive information. 2010. arXiv: 1012.1890 [math.ST]. URL: https://arxiv.org/abs/1012.1890

  22. [22]

    Why Does Space Feel the Way it Does? Towards a Principled Account of Spatial Experience

    Andrew Haun and Giulio Tononi. “Why Does Space Feel the Way it Does? Towards a Principled Account of Spatial Experience”. In: Entropy 21.12 (2019). ISSN: 1099-4300. DOI: 10.3390/e21121160. URL: https://www.mdpi.com/1099-4300/21/12/1160

  23. [23]

    Why does time feel the way it does? Towards a principled account of temporal experience

    Renzo Comolatti, Matteo Grasso, and Giulio Tononi. Why does time feel the way it does? Towards a principled account of temporal experience. 2024. arXiv: 2412.13198 [q-bio.NC]. URL: https://arxiv.org/abs/2412.13198

  24. [24]

    A Principled Infotheoretic φ-like Measure

    Virgil Griffith. “A Principled Infotheoretic φ-like Measure”. In: CoRR abs/1401.0978 (2014). arXiv: 1401.0978. URL: http://arxiv.org/abs/1401.0978

  25. [25]

    An information theoretic approach to system differentiation on the basis of statistical dependencies between subsystems

    Jürgen Jost et al. “An information theoretic approach to system differentiation on the basis of statistical dependencies between subsystems”. In: Physica A: Statistical Mechanics and its Applications 378.1 (Jan. 2007), pp. 1–10. DOI: 10.1016/j.physa.2006.11.043. URL: https://ideas.repec.org/a/eee/phsmap/v378y2007i1p1-10.html

  26. [26]

    Maps of random walks on complex networks reveal community structure

    Martin Rosvall and Carl T. Bergstrom. “Maps of random walks on complex networks reveal community structure”. In: Proceedings of the National Academy of Sciences 105.4 (Jan. 2008), pp. 1118–1123. ISSN: 1091-6490. DOI: 10.1073/pnas.0706851105. URL: http://dx.doi.org/10.1073/pnas.0706851105

  27. [27]

    Computational Mechanics: Pattern and Prediction, Structure and Simplicity

    Cosma Rohilla Shalizi and James P. Crutchfield. “Computational Mechanics: Pattern and Prediction, Structure and Simplicity”. In: Journal of Statistical Physics 104.3–4 (Aug. 2001), pp. 817–879. ISSN: 1572-9613. DOI: 10.1023/a:1010388907793. URL: http://dx.doi.org/10.1023/A:1010388907793

  28. [28]

    Quantifying structure in networks

    E. Olbrich et al. “Quantifying structure in networks”. In: The European Physical Journal B 77.2 (July 2010), pp. 239–247. ISSN: 1434-6036. DOI: 10.1140/epjb/e2010-00209-0. URL: http://dx.doi.org/10.1140/epjb/e2010-00209-0

  29. [29]

    Computing the Integrated Information of a Quantum Mechanism

    Larissa Albantakis, Robert Prentner, and Ian Durham. “Computing the Integrated Information of a Quantum Mechanism”. In: Entropy 25.3 (Mar. 2023), p. 449. ISSN: 1099-4300. DOI: 10.3390/e25030449. URL: http://dx.doi.org/10.3390/e25030449

  30. [30]

    PyPhi: A toolbox for integrated information theory

    William G. P. Mayner et al. “PyPhi: A toolbox for integrated information theory”. In: PLOS Computational Biology 14.7 (July 2018). Ed. by Kim T. Blackwell, e1006343. ISSN: 1553-7358. DOI: 10.1371/journal.pcbi.1006343. URL: http://dx.doi.org/10.1371/journal.pcbi.1006343

  31. [31]

    And until proven it is, it is unclear how to robustly numerically find the minimum of eq

    Foremost, ID is seemingly a non-convex function. And until proven it is, it is unclear how to robustly numerically find the minimum of eq. (19)

  32. [32]

    (6) is reasonably well studied in the Partial Information Decomposition literature

    The mathematical object of p∪ in eq. (6) is reasonably well studied in the Partial Information Decomposition literature. Therefore, unless there’s strong theoretical justification for preferring p²∪, it’s sensible to err on the side of choosing p∪. That said, if the IIT world prefers p²∪ over p∪, further research into the minimization of eq. (19) could...