pith. machine review for the scientific record.

arxiv: 2605.14916 · v1 · submitted 2026-05-14 · ❄️ cond-mat.dis-nn

Recognition: no theorem link

From Chaos to Synchrony in Recurrent Excitatory-Inhibitory Networks with Target-Specific Inhibition

Authors on Pith: no claims yet

Pith reviewed 2026-05-15 02:52 UTC · model grok-4.3

classification ❄️ cond-mat.dis-nn
keywords recurrent networks · excitatory-inhibitory balance · dynamical mean-field theory · asynchronous chaos · coherent oscillations · target-specific inhibition · phase diagram · neural dynamics

The pith

Target-specific inhibition organizes recurrent excitatory-inhibitory networks into three distinct dynamical regimes: quiescence, asynchronous chaos, or persistent activity with either synchronous chaos or coherent oscillations.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper extends the Sompolinsky-Crisanti-Sommers framework for random recurrent networks to two-population firing-rate models that include segregated excitatory and inhibitory cells plus inhibition targeted to specific connections, breaking excitation-inhibition balance. Dynamical mean-field theory yields self-consistent equations for mean activities and correlations, plus stability criteria that separate mean-driven from fluctuation-driven transitions. These criteria divide the phase diagram into inhibition-dominated or balanced networks, which remain either silent or asynchronously chaotic, and excitation-dominated networks, which sustain nonzero mean activity accompanied by either synchronous chaos or clean coherent oscillations. A central result is that the oscillatory regime suppresses chaotic fluctuations around the periodic trajectory rather than allowing them to coexist. This classification shows how a single wiring feature can select among qualitatively different collective states without external drive.
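The model class under study, two coupled rate populations whose outgoing couplings carry a fixed sign, can be sketched in a few lines. This is a minimal stand-in, not the paper's construction: the sizes, gain, and crude sign rule below are all illustrative, and the target-specific structure is omitted.

```python
import math, random

random.seed(0)

# Illustrative sizes and coupling scale, not the paper's parameters.
N_E, N_I = 40, 10          # excitatory / inhibitory population sizes
N = N_E + N_I
g = 1.5                    # coupling scale
dt, steps = 0.05, 400

# Random couplings with segregated signs: columns of excitatory neurons
# positive, columns of inhibitory neurons negative (a crude stand-in for
# Dale's principle; no target-specific structure here).
J = [[random.gauss(0.0, g / math.sqrt(N)) for _ in range(N)] for _ in range(N)]
for row in J:
    for j in range(N):
        row[j] = abs(row[j]) if j < N_E else -abs(row[j])

# Euler integration of the firing-rate dynamics dx/dt = -x + J . tanh(x).
x = [random.uniform(-1, 1) for _ in range(N)]
for _ in range(steps):
    phi = [math.tanh(v) for v in x]
    x = [xi + dt * (-xi + sum(J[i][j] * phi[j] for j in range(N)))
         for i, xi in enumerate(x)]

mean_E = sum(x[:N_E]) / N_E
mean_I = sum(x[N_E:]) / N_I
print(round(mean_E, 3), round(mean_I, 3))
```

Whether the population means settle, fluctuate chaotically, or oscillate is exactly what the paper's mean-field machinery predicts from the coupling statistics.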

Core claim

In two-population recurrent networks with target-specific inhibitory couplings that break E-I balance, dynamical mean-field theory produces a phase diagram partitioned into three classes. Inhibition-dominated and strictly balanced networks exhibit only quiescence or asynchronous chaos. Excitation-dominated networks sustain persistent mean activity together with either synchronous chaos or coherent oscillations, with the distinction fixed by the eigenvalues of the stability matrix. Coherent oscillations emerge without coexisting chaotic fluctuations around the mean periodic trajectory; the oscillatory instability suppresses the chaotic component.

What carries the argument

The stability matrix obtained from the linearization of the two-population mean-field equations, whose eigenvalues determine whether the persistent-activity state loses stability via a fluctuation-driven route to synchronous chaos or a mean-driven route to coherent oscillations.
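The mean-driven vs. fluctuation-driven dichotomy hinges on whether the leading eigenvalue of the linearization is real or complex when it crosses zero. A generic 2x2 E-I sketch (not the paper's Nβ,δ matrix; the unit-gain linearization and coupling values are illustrative) shows the classification logic:

```python
import cmath

def classify_instability(w_EE, w_EI, w_IE, w_II, tau=1.0):
    """Classify how a two-population E-I fixed point loses stability.

    Jacobian of tau * dx/dt = -x + W phi(x), linearized with phi'(x*) = 1:
        M = (1/tau) * [[w_EE - 1, -w_EI],
                       [w_IE,     -w_II - 1]]
    Returns the leading eigenvalue, stability, and whether the leading
    mode is oscillatory (complex pair, the mean-driven oscillatory route)
    or non-oscillatory (real eigenvalue).
    """
    a, b = (w_EE - 1) / tau, -w_EI / tau
    c, d = w_IE / tau, (-w_II - 1) / tau
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)
    lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2
    lead = lam1 if lam1.real >= lam2.real else lam2
    kind = "oscillatory" if abs(lead.imag) > 1e-12 else "non-oscillatory"
    return lead, ("unstable" if lead.real > 0 else "stable"), kind

# Strong E->I->E loop: a complex eigenvalue pair drives the instability.
lam, stab, kind = classify_instability(3.0, 3.0, 3.0, 0.5)
print(stab, kind)
```

In the paper's setting the analogous test is performed on the stability matrix of the full mean-field equations, where a complex leading eigenvalue signals the route to coherent oscillations and a real one the route to chaos.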

If this is right

  • Inhibition-dominated networks remain confined to low-activity or asynchronously chaotic states.
  • Excitation-dominated networks support persistent mean activity whose fluctuations are either irregular and synchronous or regular and oscillatory.
  • The transition to coherent oscillations eliminates chaotic variability around the mean trajectory.
  • Target-specific inhibition functions as a tunable control parameter that selects among the three regimes.
  • These transitions generalize the original SCS chaos transition to networks that possess explicit excitatory-inhibitory architecture.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The same target-specific motif could be examined in anatomical data to predict whether a given circuit region operates in the chaotic or oscillatory regime.
  • Finite-size corrections or spiking implementations would test how microscopic noise interacts with the mean-field suppression of chaos by oscillations.
  • The mechanism offers a purely internal route to chaos suppression that might be compared with externally driven suppression in other driven oscillator systems.
  • Extensions to multiple inhibitory subtypes with different targeting rules could generate richer phase diagrams containing additional synchronized states.

Load-bearing premise

Dynamical mean-field theory in the large-N limit accurately captures the macroscopic statistics and stability criteria even for finite networks that use the chosen target-specific inhibitory couplings.

What would settle it

Direct numerical integration of finite but large networks whose connectivity follows the target-specific inhibitory rule; the simulations should reproduce the same phase boundaries and the same absence of coexisting chaos and oscillations as the mean-field predictions.
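A finite-size check of this kind could start from a Benettin-style estimate of the largest Lyapunov exponent, the diagnostic the paper itself uses to separate chaotic from oscillatory regimes. The connectivity below is a plain Gaussian random matrix, a placeholder for the paper's target-specific rule, and all parameters are illustrative:

```python
import math, random

def step(z, J, dt):
    """One Euler step of dx/dt = -x + J . tanh(x)."""
    phi = [math.tanh(v) for v in z]
    return [zi + dt * (-zi + sum(J[i][k] * phi[k] for k in range(len(z))))
            for i, zi in enumerate(z)]

def lle_estimate(J, transient=300, T=400, dt=0.05, d0=1e-7, seed=1):
    """Two-trajectory (Benettin-style) largest-Lyapunov-exponent estimate:
    evolve a twin trajectory a distance d0 away, accumulate the log of the
    separation growth, and renormalize the separation at every step."""
    rng = random.Random(seed)
    n = len(J)
    x = [rng.uniform(-1, 1) for _ in range(n)]
    for _ in range(transient):          # discard the transient
        x = step(x, J, dt)
    y = [xi + d0 / math.sqrt(n) for xi in x]
    log_growth = 0.0
    for _ in range(T):
        x, y = step(x, J, dt), step(y, J, dt)
        d = math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))
        log_growth += math.log(d / d0)
        y = [a + (b - a) * (d0 / d) for a, b in zip(x, y)]  # renormalize
    return log_growth / (T * dt)

random.seed(2)
n, g = 40, 2.0   # g well above 1, where SCS mean-field theory predicts chaos
J = [[random.gauss(0, g / math.sqrt(n)) for _ in range(n)] for _ in range(n)]
lam = lle_estimate(J)
print(round(lam, 3))
```

Sweeping such an estimate over the paper's (β, δ) and coupling parameters, with the actual target-specific connectivity, is what would map the simulated phase boundaries against the mean-field ones.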

Figures

Figures reproduced from arXiv: 2605.14916 by Alessia Annibale, Carles Martorell, Miguel A. Muñoz, Rubén Calvo.

Figure 1
Figure 1: Nature of the leading eigenvalue of Nβ,δ, determining the type of resulting phase diagram, across the (β, δ) plane. The table on the right summarizes the classification of the leading eigenvalue into three distinct cases, denoted S1, S2, and S3. The conditions associated with these cases split the (β, δ) parameter space into the three regions shown in the left panel. In this panel, solid … view at source ↗
Figure 2
Figure 2: Phase diagrams and representative trajectories for (β, δ) = (0.50, 0.50) (point A in …) view at source ↗
Figure 3
Figure 3: Largest Lyapunov exponent across dynamical regimes. Heat map of the largest Lyapunov exponent (LLE) Λ for a network of size N = 1000 with (β, δ) = (0.5, 0.5) (panel A) and (β, δ) = (0.5, 0.25) (panel B), plotted as a function of the rescaled parameters ((J0/J)∗, (1/gJ)∗). Positive values of Λ identify chaotic regimes, while oscillatory activity is linked with a zero value. The LLE has been averaged over… view at source ↗
Figure 4
Figure 4: Oscillatory regimes and transition routes to the COA phase. Panel A: Time-averaged Kuramoto parameter Ry of the inhibitory population as a function of the rescaled parameters ((J0/J)∗, (1/gJ)∗), indicating the degree of global phase synchronization. Panels B–D: Representative dynamical states corresponding to colored pentagons in panel A. These panels show the excitatory (red) and inhibitory (blue) ac… view at source ↗
Figure 1
Figure 1: Distribution of excitatory synaptic weights for an … view at source ↗
Figure 2
Figure 2: Nature of the outlier eigenvalues of Nβ,δ in the (β, δ) plane. Three regimes arise depending on the real part of the leading outlier eigenvalue (see …) view at source ↗
Figure 3
Figure 3: Phase diagrams and representative trajectories for (β, δ) = (0.25, 0.50) (point A in Fig. 2; panels A–E), (β, δ) = (1.00, 0.50) (point B in Fig. 2; panels F–J), and (β, δ) = (1.50, 1.25) (point C in Fig. 2; panels K–O). Panels A, B, F, G, K, L show heat maps of the time-averaged mean inhibitory activity M̂y and the equal-time autocorrelation Ĉy(0). Solid black curves denote bifurcation lines obtained from fixed… view at source ↗
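The Kuramoto parameter Ry appearing in Figure 4 is the standard phase-coherence order parameter R = |⟨exp(iθ)⟩|. A minimal sketch of how such a quantity behaves (on synthetic phases invented for illustration, not the paper's data):

```python
import cmath, math, random

def kuramoto_R(phases):
    """Magnitude of the population phase-coherence (Kuramoto) order
    parameter R = |<exp(i*theta)>|: R near 1 means global phase
    synchrony, R near 0 an asynchronous population."""
    z = sum(cmath.exp(1j * th) for th in phases) / len(phases)
    return abs(z)

random.seed(4)
sync = [0.3 + random.gauss(0, 0.1) for _ in range(1000)]   # tight phase cluster
asyn = [random.uniform(0, 2 * math.pi) for _ in range(1000)]
print(round(kuramoto_R(sync), 2), round(kuramoto_R(asyn), 2))
```

In the paper's Figure 4 this quantity, time-averaged over the inhibitory population, distinguishes the coherent-oscillation regime (large Ry) from asynchronous activity (Ry near zero).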
read the original abstract

Biological neural networks can operate in qualitatively distinct dynamical regimes, and transitions between these regimes are thought to underlie changes in computation and behavior. The seminal work of Sompolinsky, Crisanti, and Sommers (SCS) showed that random recurrent networks undergo a transition from quiescence to asynchronous chaos, establishing a paradigmatic link between random connectivity, dynamical instability, and internally generated fluctuations in neural circuits. Here, we extend this framework to two-population firing-rate networks with segregated excitatory and inhibitory neurons and target-specific inhibitory couplings that break excitation--inhibition balance. Using dynamical mean-field theory, we derive self-consistent equations for the macroscopic mean activities and autocorrelations, together with stability criteria distinguishing mean-driven and fluctuation-driven instabilities. We show that target-specific inhibition organizes the phase diagram into three qualitative classes: inhibition-dominated or strictly balanced networks display only quiescent activity and asynchronous chaos; excitation-dominated networks display persistent activity together with either synchronous chaos with non-vanishing mean activity or coherent oscillations, depending on the stability-matrix eigenvalues. Crucially, coherent oscillations do not coexist with chaotic fluctuations around the periodic mean trajectory; rather, their onset suppresses the chaotic component, reminiscent of input-induced suppression of chaos. These results generalize SCS theory to recurrent networks with explicit excitatory--inhibitory structure and identify target-specific inhibition as a key control parameter for large-scale neural dynamics.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, and this is the friction.

Referee Report

2 major / 2 minor

Summary. The manuscript extends the Sompolinsky-Crisanti-Sommers (SCS) theory to two-population excitatory-inhibitory firing-rate networks with target-specific inhibitory couplings that break E-I balance. Using dynamical mean-field theory, it derives self-consistent equations for macroscopic mean activities and autocorrelations, together with stability criteria based on eigenvalues of a stability matrix. The phase diagram is organized into three classes: inhibition-dominated or strictly balanced networks show only quiescence or asynchronous chaos; excitation-dominated networks exhibit persistent activity with either synchronous chaos (non-vanishing mean) or coherent oscillations, where the latter onset suppresses chaotic fluctuations around the periodic mean trajectory.

Significance. If the central claims hold, this work significantly generalizes the SCS paradigm to biologically structured E-I circuits by identifying target-specific inhibition as a control parameter for transitions between chaotic and synchronous regimes. The DMFT derivation of parameter-free self-consistent equations for means and autocorrelations, combined with explicit stability criteria, provides analytical insight into chaos suppression by oscillations (reminiscent of input-induced suppression), which is a strength for the field.

major comments (2)
  1. [DMFT derivation and stability criteria] The self-consistent DMFT equations for autocorrelations (as described in the abstract and stability analysis) assume Gaussian fluctuation statistics remain closed in the large-N limit. When target-specific inhibition breaks E-I balance and produces nonzero mean activity, this may allow higher-order correlations to survive, potentially permitting residual chaotic attractors around the periodic orbit that are missed by the linear eigenvalue analysis; this assumption is load-bearing for the no-coexistence claim.
  2. [Stability analysis (eigenvalues of stability matrix)] The conclusion that 'coherent oscillations do not coexist with chaotic fluctuations around the periodic mean trajectory' (abstract) follows from the stability-matrix eigenvalues distinguishing mean-driven vs. fluctuation-driven instabilities, but lacks explicit verification (e.g., numerical solution of the DMFT equations or checks for multistability) when mean activity is nonzero; this is central to the three-class phase diagram.
minor comments (2)
  1. [Abstract] The abstract introduces 'stability-matrix eigenvalues' without a brief definition or equation reference; adding one inline would improve accessibility.
  2. [Phase diagram description] The phase-diagram classification would benefit from a schematic figure mapping the three regimes against parameters such as inhibition strength and target-specificity to make the boundaries concrete.

Simulated Author's Rebuttal

2 responses · 0 unresolved

We thank the referee for their detailed and constructive review of our manuscript. We address the major comments below, providing clarifications on the DMFT framework and adding supporting numerical evidence where appropriate.

read point-by-point responses
  1. Referee: [DMFT derivation and stability criteria] The self-consistent DMFT equations for autocorrelations (as described in the abstract and stability analysis) assume Gaussian fluctuation statistics remain closed in the large-N limit. When target-specific inhibition breaks E-I balance and produces nonzero mean activity, this may allow higher-order correlations to survive, potentially permitting residual chaotic attractors around the periodic orbit that are missed by the linear eigenvalue analysis; this assumption is load-bearing for the no-coexistence claim.

    Authors: In the large-N limit, the synaptic input to each neuron consists of a sum over many independent random connections. By the central limit theorem, the distribution of these inputs converges to a Gaussian, regardless of whether the mean activity is zero or nonzero. This justifies the closure of the DMFT equations at the level of means and autocorrelations. Higher-order correlations are suppressed as 1/N and do not affect the macroscopic dynamics. The stability analysis of the fluctuation equations around the periodic orbit confirms that chaotic fluctuations are suppressed when the mean-driven oscillatory instability occurs first. We have added a paragraph in the Methods section discussing the validity of the Gaussian assumption in the unbalanced regime. revision: partial
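The central-limit argument in this response is straightforward to probe numerically. A toy check (non-Gaussian couplings and a frozen, nonzero-mean activity pattern, both invented for illustration) asks whether the summed synaptic input is close to Gaussian via its excess kurtosis:

```python
import math, random

random.seed(3)

def excess_kurtosis(samples):
    """Sample excess kurtosis; near 0 for Gaussian-distributed data."""
    n = len(samples)
    m = sum(samples) / n
    var = sum((s - m) ** 2 for s in samples) / n
    m4 = sum((s - m) ** 4 for s in samples) / n
    return m4 / var ** 2 - 3.0

N = 400
# Frozen presynaptic activity with nonzero mean (a stand-in for the
# unbalanced regime the referee worries about).
phi = [0.5 + 0.4 * math.sin(0.1 * k) for k in range(N)]

# Couplings drawn from a *non-Gaussian* (uniform) distribution with
# variance 1/N; by the CLT the summed input should still be near-Gaussian.
a = math.sqrt(3.0 / N)
inputs = [sum(random.uniform(-a, a) * p for p in phi) for _ in range(4000)]

print(round(excess_kurtosis(inputs), 2))
```

A vanishing excess kurtosis supports the Gaussian closure at the level of this toy, though it does not by itself rule out the dynamically generated higher-order correlations the referee raises.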

  2. Referee: [Stability analysis (eigenvalues of stability matrix)] The conclusion that 'coherent oscillations do not coexist with chaotic fluctuations around the periodic mean trajectory' (abstract) follows from the stability-matrix eigenvalues distinguishing mean-driven vs. fluctuation-driven instabilities, but lacks explicit verification (e.g., numerical solution of the DMFT equations or checks for multistability) when mean activity is nonzero; this is central to the three-class phase diagram.

    Authors: We acknowledge that direct numerical verification strengthens the claim. In the revised manuscript, we have included numerical solutions of the self-consistent DMFT equations for the mean activities and autocorrelations in the excitation-dominated regime. These simulations show that in the parameter region where the oscillatory solution is stable according to the eigenvalue analysis, the autocorrelation functions exhibit periodic behavior without additional chaotic decay components. Furthermore, we performed checks by varying initial conditions and found no indications of multistability or coexisting chaotic attractors. This supports the three-class phase diagram and the suppression of chaos by oscillations. revision: yes

Circularity Check

0 steps flagged

DMFT derivation self-contained with only non-load-bearing self-citation

full rationale

The self-consistent equations for macroscopic means and autocorrelations, together with the stability-matrix eigenvalue criteria, are obtained directly from the large-N limit of the two-population rate equations via dynamical mean-field theory. No step reduces a claimed prediction to a fitted input or to a self-citation whose content is itself the target result; the organization of the phase diagram into the three classes follows from the explicit target-specific inhibitory terms without circular redefinition. Any prior citations (including to SCS) supply independent background rather than load-bearing uniqueness theorems.

Axiom & Free-Parameter Ledger

0 free parameters · 1 axioms · 0 invented entities

The central claims rest on the applicability of dynamical mean-field theory to the large-N limit and the modeling choice of target-specific inhibitory couplings; no free parameters or invented entities are introduced in the abstract.

axioms (1)
  • domain assumption Dynamical mean-field theory accurately describes the macroscopic mean activities and autocorrelations in the thermodynamic limit
    Invoked to derive the self-consistent equations and stability criteria

pith-pipeline@v0.9.0 · 5562 in / 1303 out tokens · 42578 ms · 2026-05-15T02:52:07.733375+00:00 · methodology

discussion (0)


Reference graph

Works this paper leans on

79 extracted references · 79 canonical work pages · 2 internal anchors

  1. [1]

    E. R. Kandel (Ed.), Principles of neural science, 4th Edition, McGraw-Hill, New York, NY , 2000

  2. [2]

    Dayan, L

    P. Dayan, L. F. Abbott, Theoretical Neuroscience: Computational and Mathematical Modeling of Neu- ral Systems, MIT Press, 2005

  3. [3]

    Gerstner, W

    W. Gerstner, W. M. Kistler, R. Naud, L. Panin- ski, Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition, 1st Edition, Cambridge University Press, 2014. doi:10.1017/CBO9781107447615. URLhttps://www.cambridge.org/core/ product/identifier/9781107447615/type/ book

  4. [4]

    E. M. Izhikevich, Dynamical systems in neuro- science: the geometry of excitability and burst- ing, Computational neuroscience, MIT Press, Cam- bridge, Mass, 2007

  5. [5]

    Stringer, M

    C. Stringer, M. Pachitariu, N. Steinmetz, M. Carandini, K. D. Harris, High-dimensional geometry of population responses in visual cortex, Nature 571 (7765) (2019) 361–365. doi:10.1038/s41586-019-1346-5. URLhttps://www.nature.com/articles/ s41586-019-1346-5

  6. [6]

    Sompolinsky, A

    H. Sompolinsky, A. Crisanti, H. J. Sommers, Chaos in Random Neural Networks, Physi- cal Review Letters 61 (3) (1988) 259–262. doi:10.1103/PhysRevLett.61.259. URLhttps://link.aps.org/doi/10.1103/ PhysRevLett.61.259

  7. [7]

    Harish, D

    O. Harish, D. Hansel, Asynchronous Rate Chaos in Spiking Neuronal Circuits, PLOS Com- putational Biology 11 (7) (2015) e1004266. doi:10.1371/journal.pcbi.1004266. URLhttps://dx.plos.org/10.1371/ journal.pcbi.1004266

  8. [8]

    Dahmen, S

    D. Dahmen, S. Grün, M. Diesmann, M. Helias, Second type of criticality in the brain uncovers rich multiple-neuron dynamics, Proceedings of the National Academy of Sciences 116 (26) (2019) 13051–13060.doi:10.1073/pnas.1818972116. URLhttp://www.pnas.org/lookup/doi/10. 1073/pnas.1818972116

  9. [9]

    J. Li, W. L. Shew, Tuning network dynamics from criticality to an asynchronous state, PLOS Computational Biology 16 (9) (2020) e1008268. doi:10.1371/journal.pcbi.1008268. URLhttps://dx.plos.org/10.1371/ journal.pcbi.1008268

  10. [10]

    Martorell, R

    C. Martorell, R. Calvo, A. Roig, A. Annibale, M. A. Muñoz, Ergodicity Breaking and High-Dimensional Chaos in Random Recurrent Networks (Oct. 2025). doi:10.48550/arXiv.2510.07932. URLhttp://arxiv.org/abs/2510.07932 19

  11. [11]

    Transition to chaos in random neuronal networks

    J. Kadmon, H. Sompolinsky, Transition to chaos in random neuronal networks, Physical Review X 5 (4) (2015) 041030.doi:10.1103/PhysRevX.5. 041030. URLhttp://arxiv.org/abs/1508.06486

  12. [12]

    Schuecker, S

    J. Schuecker, S. Goedeke, M. Helias, Optimal Sequence Memory in Driven Random Net- works, Physical Review X 8 (4) (2018) 041029. doi:10.1103/PhysRevX.8.041029. URLhttps://link.aps.org/doi/10.1103/ PhysRevX.8.041029

  13. [13]

    Crisanti, H

    A. Crisanti, H. Sompolinsky, Path Integral Approach to Random Neural Networks, Physical Review E 98 (6) (2018) 062120.doi:10.1103/PhysRevE. 98.062120

  14. [14]

    I. D. Landau, H. Sompolinsky, Coherent chaos in a recurrent neural network with structured connectivity, PLOS Computa- tional Biology 14 (12) (2018) e1006309. doi:10.1371/journal.pcbi.1006309. URLhttps://dx.plos.org/10.1371/ journal.pcbi.1006309

  15. [15]

    M. Dick, A. Van Meegen, M. Helias, Link- ing network- and neuron-level correla- tions by renormalized field theory, Physi- cal Review Research 6 (3) (2024) 033264. doi:10.1103/PhysRevResearch.6.033264. URLhttps://link.aps.org/doi/10.1103/ PhysRevResearch.6.033264

  16. [16]

    Martorell, R

    C. Martorell, R. Calvo, A. Annibale, M. A. Muñoz, Dynamically selected steady states and criticality in non-reciprocal networks, Chaos, Solitons & Fractals 182 (2024) 114809. doi:10.1016/j.chaos.2024.114809. URLhttps://linkinghub.elsevier.com/ retrieve/pii/S0960077924003618

  17. [17]

    Kadmon, Efficient Coding with Chaotic Neu- ral Networks: A Journey From Neuroscience to Physics and Back, Human Arenas (Jun

    J. Kadmon, Efficient Coding with Chaotic Neu- ral Networks: A Journey From Neuroscience to Physics and Back, Human Arenas (Jun. 2025). doi:10.1007/s42087-025-00507-9. URLhttps://link.springer.com/10.1007/ s42087-025-00507-9

  18. [18]

    Sherrington, S

    D. Sherrington, S. Kirkpatrick, Solv- able Model of a Spin-Glass, Physical Re- view Letters 35 (26) (1975) 1792–1796. doi:10.1103/PhysRevLett.35.1792. URLhttps://link.aps.org/doi/10.1103/ PhysRevLett.35.1792

  19. [19]

    H. Nishimori, Statistical Physics of Spin Glasses and Information Processing: An In- troduction, 1st Edition, Oxford University PressOxford, 2001.doi:10.1093/acprof: oso/9780198509417.001.0001. URLhttps://academic.oup.com/book/5185

  20. [20]

    S. J. Fournier, P. Urbani, High-dimensional dynam- ical systems: co-existence of attractors, phase tran- sitions, maximal Lyapunov exponent and response to periodic drive (Nov. 2025).doi:10.48550/ arXiv.2511.09679. URLhttp://arxiv.org/abs/2511.09679

  21. [21]

    S. J. Fournier, A. Pacco, V . Ros, P. Urbani, Non- reciprocal interactions and high-dimensional chaos: comparing dynamics and statistics of equilibria in a solvable model (Mar. 2025).doi:10.48550/ arXiv.2503.20908. URLhttp://arxiv.org/abs/2503.20908

  22. [22]

    van Vreeswijk, H

    C. van Vreeswijk, H. Sompolinsky, Chaos in Neu- ronal Networks with Balanced Excitatory and In- hibitory Activity, Science 274 (5293) (1996) 1724– 1726.doi:10.1126/science.274.5293.1724. URLhttps://www.science.org/doi/10. 1126/science.274.5293.1724

  23. [23]

    C. V . Vreeswijk, H. Sompolinsky, Chaotic Bal- anced State in a Model of Cortical Circuits, Neural Computation 10 (6) (1998) 1321–1371. doi:10.1162/089976698300017214. URLhttps://direct.mit.edu/neco/ article/10/6/1321-1371/6179

  24. [24]

    Brunel, Dynamics of Sparsely Connected Net- works of Excitatory and Inhibitory Spiking Neurons, Journal of Computational Neuroscience 8 (2000) 183–208.doi:10.1023/a:1008925309027

    N. Brunel, Dynamics of Sparsely Connected Net- works of Excitatory and Inhibitory Spiking Neurons, Journal of Computational Neuroscience 8 (2000) 183–208.doi:10.1023/a:1008925309027

  25. [25]

    Renart, J

    A. Renart, J. De La Rocha, P. Bartho, L. Hollender, N. Parga, A. Reyes, K. D. Harris, The Asynchronous 20 State in Cortical Circuits, Science 327 (5965) (2010) 587–590.doi:10.1126/science.1179850. URLhttps://www.science.org/doi/10. 1126/science.1179850

  26. [26]

    Buendía, P

    V . Buendía, P. Villegas, S. di Santo, A. Vezzani, R. Burioni, M. A. Muñoz, Jensen’s force and the statistical mechanics of cortical asynchronous states, Scientific Reports 9 (1) (2019) 15183. doi:10.1038/s41598-019-51520-2. URLhttps://www.nature.com/articles/ s41598-019-51520-2

  27. [27]

    Mastrogiuseppe, S

    F. Mastrogiuseppe, S. Ostojic, Intrinsically- generated fluctuating activity in excitatory- inhibitory networks, PLOS Computa- tional Biology 13 (4) (2017) e1005498. doi:10.1371/journal.pcbi.1005498. URLhttps://dx.plos.org/10.1371/ journal.pcbi.1005498

  28. [28]

    Mastrogiuseppe, S

    F. Mastrogiuseppe, S. Ostojic, Linking Connectivity, Dynamics, and Computations in Low-Rank Recur- rent Neural Networks, Neuron 99 (3) (2018) 609– 623.e29.doi:10.1016/j.neuron.2018.07.003. URLhttps://linkinghub.elsevier.com/ retrieve/pii/S0896627318305439

  29. [29]

    Schuessler, A

    F. Schuessler, A. Dubreuil, F. Mastrogiuseppe, S. Ostojic, O. Barak, Dynamics of random recur- rent networks with correlated low-rank structure, Physical Review Research 2 (1) (2020) 013111. doi:10.1103/PhysRevResearch.2.013111. URLhttps://link.aps.org/doi/10.1103/ PhysRevResearch.2.013111

  30. [30]

    Beiran, A

    M. Beiran, A. Dubreuil, A. Valente, F. Mas- trogiuseppe, S. Ostojic, Shaping Dynamics With Multiple Populations in Low-Rank Recurrent Networks, Neural Computation 33 (6) (2021) 1572–1615.doi:10.1162/neco_a_01381. URLhttps://doi.org/10.1162/neco_a_ 01381

  31. [31]

    Dubreuil, A

    A. Dubreuil, A. Valente, M. Beiran, F. Mas- trogiuseppe, S. Ostojic, The role of population structure in computations through neural dynamics, Nature Neuroscience 25 (6) (2022) 783–794. doi:10.1038/s41593-022-01088-4. URLhttps://www.nature.com/articles/ s41593-022-01088-4

  32. [32]

    L. C. García Del Molino, K. Pakdaman, J. Touboul, G. Wainrib, Synchronization in random balanced networks, Physical Review E 88 (4) (2013) 042824. doi:10.1103/PhysRevE.88.042824. URLhttps://link.aps.org/doi/10.1103/ PhysRevE.88.042824

  33. [33]

    Kepecs, G

    A. Kepecs, G. Fishell, Interneuron cell types are fit to function, Nature 505 (7483) (2014) 318–326. doi:10.1038/nature12983. URLhttps://www.nature.com/articles/ nature12983

  34. [34]

    Tremblay, S

    R. Tremblay, S. Lee, B. Rudy, GABAergic Interneu- rons in the Neocortex: From Cellular Properties to Circuits, Neuron 91 (2) (2016) 260–292. doi:10.1016/j.neuron.2016.06.033. URLhttps://linkinghub.elsevier.com/ retrieve/pii/S0896627316303117

  35. [35]

    Corral López, V

    R. Corral López, V . Buendía, M. A. Muñoz, Excitatory-inhibitory branching process: A parsimonious view of cortical asynchronous states, excitability, and criticality, Physi- cal Review Research 4 (4) (2022) L042027. doi:10.1103/PhysRevResearch.4.L042027. URLhttps://link.aps.org/doi/10.1103/ PhysRevResearch.4.L042027

  36. [36]

    Dayan, L

    P. Dayan, L. F. Abbott, Network models, Ch 7, in: Theoretical neuroscience: computational and math- ematical modeling of neural systems, first paperback ed Edition, Computational neuroscience, MIT Press, Cambridge, Mass., 2005

  37. [37]

    T. P. V ogels, K. Rajan, L. Abbott, NEURAL NETWORK DYNAMICS, Annual Review of Neuroscience 28 (1) (2005) 357–376.doi: 10.1146/annurev.neuro.28.061604.135637. URLhttps://www.annualreviews.org/doi/ 10.1146/annurev.neuro.28.061604.135637

  38. [38]

    Martorell, Supplemental Material

    C. Martorell, Supplemental Material. 21

  39. [39]

    F. L. Metz, Dynamical Mean-Field Theory of Complex Systems on Sparse Directed Networks, Physical Review Letters 134 (3) (2025) 037401. doi:10.1103/PhysRevLett.134.037401. URLhttps://link.aps.org/doi/10.1103/ PhysRevLett.134.037401

  40. [40]

    Benettin, L

    G. Benettin, L. Galgani, A. Giorgilli, J.-M. Strelcyn, Lyapunov Characteristic Exponents for smooth dynamical systems and for hamiltonian systems; A method for computing all of them. Part 2: Numer- ical application, Meccanica 15 (1) (1980) 21–30. doi:10.1007/BF02128237. URLhttp://link.springer.com/10.1007/ BF02128237

  41. [41]

    A. Wolf, J. B. Swift, H. L. Swinney, J. A. Vastano, Determining Lyapunov exponents from a time series, Physica D: Nonlin- ear Phenomena 16 (3) (1985) 285–317. doi:10.1016/0167-2789(85)90011-9. URLhttps://linkinghub.elsevier.com/ retrieve/pii/0167278985900119

  42. [42]

    Pikovsky, A

    A. Pikovsky, A. Politi, Lyapunov exponents: a tool to explore complex dynamics, Cambridge Univer- sity Press, Cambridge, 2016

  43. [43]

    Rajan, L

    K. Rajan, L. F. Abbott, H. Sompolinsky, Stimulus- dependent suppression of chaos in recurrent neural networks, Physical Review E 82 (1) (2010) 011903. doi:10.1103/PhysRevE.82.011903. URLhttps://link.aps.org/doi/10.1103/ PhysRevE.82.011903

  44. [44]

    S. H. Strogatz, Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry, and Engineering, 2nd Edition, CRC Press, Boca Ra- ton, 2019.doi:10.1201/9780429492563

  45. [45]

    Guckenheimer, P

    J. Guckenheimer, P. Holmes, Nonlinear oscillations, dynamical systems, and bifurcations of vector fields, corrected seventh printing Edition, no. 42 in Applied mathematical sciences, Springer Science+Business Media, New York, NY , 2002

  46. [46]

    Cohen, Time frequency analysis, Prentice Hall signal processing series, Prentice Hall, Englewood Cliffs, NJ, 1995

    L. Cohen, Time frequency analysis, Prentice Hall signal processing series, Prentice Hall, Englewood Cliffs, NJ, 1995

  47. [47]

    J. M. Beggs, D. Plenz, Neuronal Avalanches in Neocortical Circuits, Journal of Neu- roscience 23 (35) (2003) 11167–11177. doi:10.1523/JNEUROSCI.23-35-11167.2003. URLhttps://www.jneurosci.org/content/ 23/35/11167

  48. [48]

    J. M. Beggs, The criticality hypothesis: how lo- cal cortical networks might optimize information processing, Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineer- ing Sciences 366 (1864) (2008) 329–343.doi: 10.1098/rsta.2007.2092. URLhttps://royalsocietypublishing.org/ doi/10.1098/rsta.2007.2092

  49. [49]

    W. L. Shew, D. Plenz, The Functional Ben- efits of Criticality in the Cortex, The Neu- roscientist 19 (1) (2013) 88–100.doi: 10.1177/1073858412445487. URLhttps://doi.org/10.1177/ 1073858412445487

  50. [50]

    T. Mora, W. Bialek, Are Biological Sys- tems Poised at Criticality?, Journal of Sta- tistical Physics 144 (2) (2011) 268–302. doi:10.1007/s10955-011-0229-4. URLhttps://doi.org/10.1007/ s10955-011-0229-4

  51. [51]

    M. A. Muñoz, Colloquium: Criticality and dynamical scaling in living systems, Reviews of Modern Physics 90 (031001) (Jul. 2018). doi:10.1103/RevModPhys.90.031001. URLhttps://link.aps.org/doi/10.1103/ RevModPhys.90.031001

  52. [52]

    de Arcangelis, C

    L. de Arcangelis, C. Perrone-Capano, H. J. Herrmann, Self-Organized Criti- cality Model for Brain Plasticity, Physi- cal Review Letters 96 (2) (2006) 028107. doi:10.1103/PhysRevLett.96.028107. URLhttps://link.aps.org/doi/10.1103/ PhysRevLett.96.028107

  53. [53]

    G. Deco, M. L. Kringelbach, V . K. Jirsa, P. Rit- ter, The dynamics of resting fluctuations in the 22 brain: metastability and its dynamical corti- cal core, Scientific Reports 7 (1) (2017) 3095. doi:10.1038/s41598-017-03073-5. URLhttps://www.nature.com/articles/ s41598-017-03073-5

  54. [54]

    Wilting, V

    J. Wilting, V . Priesemann, 25 years of criti- cality in neuroscience — established results, open controversies, novel concepts, Current Opinion in Neurobiology 58 (2019) 105–111. doi:10.1016/j.conb.2019.08.002. URLhttps://www.sciencedirect.com/ science/article/pii/S0959438819300248

  55. [55]

    O’Byrne, K

    J. O’Byrne, K. Jerbi, How critical is brain critical- ity?, Trends in Neurosciences 45 (11) (2022) 820– 837.doi:10.1016/j.tins.2022.08.007

  56. [56]

    K. B. Hengen, W. L. Shew, Is criticality a uni- fied setpoint of brain function?, Neuron 113 (16) (2025) 2582–2598.e2.doi:10.1016/j.neuron. 2025.05.020

  57. [57]

    A review on brain tumor segmentation based on deep learning methods with federated learning techniques

    L. Cocchi, L. L. Gollo, A. Zalesky, M. Breaks- pear, Criticality in the brain: A synthesis of neu- robiology, models and cognition, Progress in Neu- robiology 158 (2017) 132–152.doi:10.1016/j. pneurobio.2017.07.002

  58. [58]

    T. Tao, V . Vu, M. Krishnapur, Random matri- ces: Universality of ESDs and the circular law, arXiv:0807.4898 [math] (Apr. 2009). URLhttp://arxiv.org/abs/0807.4898

  [59] T. Tao, Outliers in the spectrum of iid matrices with bounded rank perturbations, Probability Theory and Related Fields 155 (1-2) (2013) 231–263. doi:10.1007/s00440-011-0397-9.

  [60] A. Knowles, J. Yin, The outliers of a deformed Wigner matrix, The Annals of Probability 42 (5) (Sep. 2014). doi:10.1214/13-AOP855.

  [61] K. Rajan, L. F. Abbott, Eigenvalue Spectra of Random Matrices for Neural Networks, Physical Review Letters 97 (18) (2006) 188104. doi:10.1103/PhysRevLett.97.188104.

  [62] F. Roy, G. Biroli, G. Bunin, C. Cammarota, Numerical implementation of dynamical mean field theory for disordered systems: application to the Lotka–Volterra model of ecosystems, Journal of Physics A: Mathematical and Theoretical 52 (48) (2019) 484001. doi:10.1088/1751-8121/ab1f32.

  [63] W. Zou, H. Huang, Introduction to dynamical mean-field theory of randomly connected neural networks with bidirectionally correlated couplings, SciPost Physics Lecture Notes (2024) 79. doi:10.21468/SciPostPhysLectNotes.79.

Supplemental Material

Carles Martorell, Rubén Calvo, Alessia Annibale and Miguel Ángel Muñoz

Contents

A. Transitions from the Quiescent State 12
B. Transition from the Asynchronous Chaos 13
   Transition from the PA to COA 13
C. Stability of the Response-Response Function 13
   Transition from PA to SC 14
References 14

I. EXCITATORY AND INHIBITORY FIRING RATE SYSTEM

The general system has two populations of, respectively, fN excitatory (E) and (1−f)N inhibitory (I) neurons, f being the fraction of the excitatory population. Denoting by x_i(t) (i = 1, ..., fN) and y_k(t) (k = fN+1, ..., N) the time-dependent rates of the excitatory and inhibi...
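The two-population rate dynamics introduced above can be sketched numerically. The snippet below is a minimal illustration, not the paper's code: it assumes SCS-style dynamics ẋ = −x + g J φ(x) with φ = tanh, and the scaling of inhibitory synapses onto E and I targets by β and δ is a stand-in for the paper's target-specific couplings.

```python
import numpy as np

# Minimal sketch of a two-population E-I rate network (assumed dynamics
# dx/dt = -x + g J phi(x), phi = tanh). f, g, beta, delta are illustrative
# stand-ins for the paper's parameters, not its exact definitions.
rng = np.random.default_rng(0)
N, f = 400, 0.8                  # network size, excitatory fraction
nE = int(f * N)                  # indices 0..nE-1 are excitatory
g = 2.0                          # overall coupling gain

# Random Gaussian couplings with 1/sqrt(N) scaling; inhibitory (presynaptic)
# columns are made negative and scaled target-specifically: beta onto E rows,
# delta onto I rows (assumption standing in for the paper's beta, delta).
beta, delta = 0.5, 1.5
J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
J[:nE, nE:] *= -beta             # I -> E synapses
J[nE:, nE:] *= -delta            # I -> I synapses

# Euler integration of the rate dynamics.
x = rng.normal(0.0, 1.0, N)
dt, steps = 0.05, 2000
for _ in range(steps):
    x += dt * (-x + g * J @ np.tanh(x))

M = np.tanh(x).mean()            # population mean activity
C0 = np.tanh(x).var()            # zero-lag fluctuation amplitude
print(f"M = {M:.3f}, C(0) = {C0:.3f}")
```

Scanning g, beta, delta in a script like this gives a crude numerical counterpart of the mean-field phase diagram (quiescence, asynchronous chaos, persistent activity).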

A. Transitions from the Quiescent State

For fixed points, such that C(τ) ≡ q for any τ, the Jacobian matrix J reduces to simple expressions. When the quiescent solution is considered, described by M = 0 and q = 0, the Jacobian matrix becomes block-diagonal: J_MQ = J_QM = 0, while

J_MM = g J_0 N_{β,δ},    J_QQ = (gJ)² M_{β,δ}.    (67)

The eigenvalues of these matrices are, respect...
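The block-diagonal structure of Eq. (67) makes the quiescent stability test easy to state: quiescence is linearly stable when every eigenvalue of both blocks, shifted by the identity, has negative real part. A hedged sketch follows; the entries of the 2×2 population matrices N_{β,δ} and M_{β,δ} below are placeholders, not the paper's exact expressions.

```python
import numpy as np

def quiescent_stable(J_MM, J_QQ):
    """Stability of the quiescent state M = 0, q = 0 from the two Jacobian
    blocks of Eq. (67): stable iff all eigenvalues of J_MM - Id and
    J_QQ - Id have negative real part, i.e. Re(lambda) < 1 for both blocks."""
    lams = np.concatenate([np.linalg.eigvals(J_MM), np.linalg.eigvals(J_QQ)])
    return bool(np.all(lams.real < 1.0))

# Illustrative parameters; N_bd and M_bd are placeholder population matrices.
g, J0, J = 0.5, 1.0, 1.0
N_bd = np.array([[1.0, -0.5], [1.0, -1.5]])
M_bd = np.array([[1.0, 0.25], [1.0, 2.25]])

stable = quiescent_stable(g * J0 * N_bd, (g * J) ** 2 * M_bd)
print("quiescent state stable:", stable)
```

Sweeping g through this check reproduces the mean-driven (J_MM block) versus fluctuation-driven (J_QQ block) instability distinction discussed in the main text.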

B. Transition from the Asynchronous Chaos

The transition from the AC phase to the SC or COA phase emerges as a bifurcation of the mean activity M; therefore, the transition can be computed by analyzing the stability of the AC state, described by M = 0 and C(0) = C_0^c, against a small perturbation, where C_0^c describes the zero-lag autocorrelator selected by the system ...

Therefore, the stability criterion is characterized by the spectrum of J_MM − Id, such that

z_x = gJ √( (C_x)_0^c + β² (C_y)_0^c ) z,    z_y = gJ √( (C_x)_0^c + δ² (C_y)_0^c ) z.

Transition from the PA to COA

The transition from the PA to the COA phase also appears as a bifurcation of the mean activity M. Hence, an analysis identical to that done for the AC–COA/SC transition can be performed, obtaining the same criterion but assuming fixed-point solutions: M and C(0) = q.

C. Stability of the Response-Response Function

The phase transition from PA to SC...

Transition from PA to SC

The transition from the PA to the SC phase is hence described by the second-order response of the autocorrelation. Assuming that the (unperturbed) state is a fixed point, M ≠ 0 and q ≠ 0, the response against perturbations of the autocorrelator is governed by the matrix

(gJ)² D_{φ′,φ′} M_{β,δ} − Id,    (78)

where

[ D_{φ′,φ′} ]_{αβ} = ⟨ φ′(z_α)² ⟩ δ_{αβ}.    (79)

The st...
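The criterion of Eqs. (78)–(79) can be evaluated numerically: the fixed point loses stability to chaotic fluctuations when the leading eigenvalue of (gJ)² D_{φ′,φ′} M_{β,δ} exceeds 1. The sketch below assumes φ = tanh and evaluates ⟨φ′(z_α)²⟩ over a Gaussian field z_α with mean M_α and variance q_α by Gauss–Hermite quadrature; the matrix M_bd is a placeholder for the paper's M_{β,δ}.

```python
import numpy as np

# Probabilists' Hermite quadrature: nodes/weights for weight exp(-z^2/2);
# normalizing by sqrt(2*pi) turns it into an average over a standard normal.
nodes, weights = np.polynomial.hermite_e.hermegauss(101)
weights = weights / np.sqrt(2.0 * np.pi)

def mean_phiprime_sq(mean, var):
    """Gaussian average <phi'(z)^2> for phi = tanh, z ~ N(mean, var)."""
    z = mean + np.sqrt(var) * nodes
    phip = 1.0 - np.tanh(z) ** 2
    return np.sum(weights * phip ** 2)

def pa_unstable_to_sc(g, J, M_pop, q_pop, M_bd):
    """PA -> SC instability test, Eqs. (78)-(79): build the diagonal matrix
    D of Eq. (79) and check whether (gJ)^2 D M_bd has an eigenvalue > 1."""
    D = np.diag([mean_phiprime_sq(m, q) for m, q in zip(M_pop, q_pop)])
    lam = np.linalg.eigvals((g * J) ** 2 * D @ M_bd)
    return bool(np.max(lam.real) > 1.0)

M_bd = np.array([[1.0, 0.25], [1.0, 2.25]])   # placeholder M_{beta,delta}
print(pa_unstable_to_sc(g=2.0, J=1.0, M_pop=[0.5, 0.3], q_pop=[0.4, 0.4], M_bd=M_bd))
```

In a full calculation, M_pop and q_pop would be the self-consistent fixed-point values of M and q from the mean-field equations rather than the illustrative numbers used here.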

  [72] J. Kadmon and H. Sompolinsky, Transition to chaos in random neuronal networks, Physical Review X 5, 041030 (2015).

  [73] O. Harish and D. Hansel, Asynchronous Rate Chaos in Spiking Neuronal Circuits, PLOS Computational Biology 11, e1004266 (2015).

  [74] F. Mastrogiuseppe and S. Ostojic, Intrinsically-generated fluctuating activity in excitatory-inhibitory networks, PLOS Computational Biology 13, e1005498 (2017).

  [75] H. Sompolinsky, A. Crisanti, and H. J. Sommers, Chaos in Random Neural Networks, Physical Review Letters 61, 259 (1988).

  [76] J. Schuecker, S. Goedeke, and M. Helias, Optimal Sequence Memory in Driven Random Networks, Physical Review X 8, 041029 (2018).

  [77] M. Helias and D. Dahmen, Statistical Field Theory for Neural Networks, Lecture Notes in Physics, Vol. 970 (Springer International Publishing, Cham, 2020).

  [78] F. Mastrogiuseppe and S. Ostojic, Linking Connectivity, Dynamics, and Computations in Low-Rank Recurrent Neural Networks, Neuron 99, 609 (2018).

  [79] C. Martorell, R. Calvo, A. Roig, A. Annibale, and M. A. Muñoz, Ergodicity Breaking and High-Dimensional Chaos in Random Recurrent Networks (2025).