From Chaos to Synchrony in Recurrent Excitatory-Inhibitory Networks with Target-Specific Inhibition
Pith reviewed 2026-05-15 02:52 UTC · model grok-4.3
The pith
Target-specific inhibition organizes recurrent excitatory-inhibitory networks into three dynamical classes: quiescence, asynchronous chaos, or persistent activity accompanied by either synchronous chaos or coherent oscillations.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
In two-population recurrent networks with target-specific inhibitory couplings that break E-I balance, dynamical mean-field theory produces a phase diagram partitioned into three classes. Inhibition-dominated and strictly balanced networks exhibit only quiescence or asynchronous chaos. Excitation-dominated networks sustain persistent mean activity together with either synchronous chaos or coherent oscillations, with the distinction fixed by the eigenvalues of the stability matrix. Coherent oscillations emerge without coexisting chaotic fluctuations around the mean periodic trajectory; the oscillatory instability suppresses the chaotic component.
What carries the argument
The stability matrix obtained from the linearization of the two-population mean-field equations, whose eigenvalues determine whether the persistent-activity state loses stability via a fluctuation-driven route to synchronous chaos or a mean-driven route to coherent oscillations.
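The eigenvalue criterion can be sketched numerically. The sketch below is illustrative only: the 2x2 effective-gain matrix and its entries are hypothetical placeholders standing in for the paper's linearized mean-field operator, not its actual stability matrix or fitted parameters.

```python
import numpy as np

def classify_instability(g_EE, g_EI, g_IE, g_II, leak=1.0):
    # Linearization of a two-population rate model around a fixed point:
    # d(dM)/dt = (-leak*I + G) dM, with G the effective-gain matrix.
    # A complex leading eigenvalue with positive real part signals a
    # mean-driven (oscillatory) instability; a real one does not.
    G = np.array([[g_EE, -g_EI],
                  [g_IE, -g_II]])
    A = -leak * np.eye(2) + G
    eigvals = np.linalg.eigvals(A)
    lead = eigvals[np.argmax(eigvals.real)]
    if lead.real < 0:
        return "stable persistent activity"
    return "coherent oscillations" if abs(lead.imag) > 1e-12 else "non-oscillatory instability"

print(classify_instability(2.2, 1.0, 2.0, 0.1))  # complex pair, Re > 0
print(classify_instability(2.5, 0.2, 0.1, 0.3))  # real unstable eigenvalue
print(classify_instability(1.2, 0.5, 2.0, 0.3))  # all eigenvalues stable
```

The three calls land in the three qualitative outcomes the review describes, purely as a function of where the leading eigenvalue of the placeholder matrix sits in the complex plane.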
If this is right
- Inhibition-dominated networks remain confined to low-activity or asynchronously chaotic states.
- Excitation-dominated networks support persistent mean activity whose fluctuations are either irregular and synchronous or regular and oscillatory.
- The transition to coherent oscillations eliminates chaotic variability around the mean trajectory.
- Target-specific inhibition functions as a tunable control parameter that selects among the three regimes.
- These transitions generalize the original SCS chaos transition to networks that possess explicit excitatory-inhibitory architecture.
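The idea of inhibition strength as a tunable control parameter can be illustrated with the classical spectral-radius criterion for the quiescent state (chaos onsets once the radius exceeds 1). Everything here is a simplification for illustration: zero-mean Gaussian couplings with no Dale sign constraint, and `s` is a stand-in scaling of the inhibitory columns, not the paper's target-specific parametrization.

```python
import numpy as np

rng = np.random.default_rng(0)

def spectral_radius(N=400, f=0.8, g=1.0, s=1.0):
    # Random two-population connectivity: a fraction f of columns "excitatory"
    # with scale g, the rest "inhibitory", rescaled by the illustrative factor s.
    nE = int(f * N)
    J = rng.normal(0.0, 1.0, (N, N))
    J[:, :nE] *= g / np.sqrt(N)
    J[:, nE:] *= s * g / np.sqrt(N)
    return np.max(np.abs(np.linalg.eigvals(J)))

for s in (0.5, 1.0, 2.0):
    r = spectral_radius(s=s)
    regime = "instability (chaos) expected" if r > 1.0 else "quiescence expected"
    print(f"s={s}: spectral radius ~ {r:.2f} -> {regime}")
```

Sweeping `s` moves the bulk spectral radius (roughly g·sqrt(f + (1−f)s²) for this toy ensemble) across the stability boundary, which is the sense in which a column-wise inhibitory scaling acts as a control parameter.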
Where Pith is reading between the lines
- The same target-specific motif could be examined in anatomical data to predict whether a given circuit region operates in the chaotic or oscillatory regime.
- Finite-size corrections or spiking implementations would test how microscopic noise interacts with the mean-field suppression of chaos by oscillations.
- The mechanism offers a purely internal route to chaos suppression that might be compared with externally driven suppression in other driven oscillator systems.
- Extensions to multiple inhibitory subtypes with different targeting rules could generate richer phase diagrams containing additional synchronized states.
Load-bearing premise
Dynamical mean-field theory in the large-N limit accurately captures the macroscopic statistics and stability criteria even for finite networks that use the chosen target-specific inhibitory couplings.
What would settle it
Direct numerical integration of finite but large networks whose connectivity follows the target-specific inhibitory rule; the simulations should show the same phase boundaries and the same absence of coexisting chaos plus oscillations as the mean-field predictions.
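A minimal version of that settling experiment can be sketched as follows. This is not the paper's model: the network size, gains, weight distribution, and the specific target-specific split (`g_I_to_E` vs `g_I_to_I`) are all placeholder assumptions, chosen only to show the shape of the test (integrate a finite Dale-compliant rate network and read off the time-averaged population activity).

```python
import numpy as np

rng = np.random.default_rng(1)

N, f = 400, 0.8
nE = int(f * N)                          # 320 excitatory, 80 inhibitory
g_E, g_I_to_E, g_I_to_I = 2.0, 1.0, 2.5  # illustrative, excitation-dominated

# Dale-compliant random weights: positive from E columns, negative from I columns,
# with target-specific inhibitory strengths onto E vs I rows.
J = np.abs(rng.normal(0.0, 1.0, (N, N))) / np.sqrt(N)
J[:, :nE] *= g_E
J[:nE, nE:] *= -g_I_to_E                 # I -> E
J[nE:, nE:] *= -g_I_to_I                 # I -> I

# Euler integration of dx/dt = -x + J @ tanh(x).
x = rng.normal(0.0, 0.5, N)
dt, T = 0.05, 3000
means = np.empty(T)
for step in range(T):
    x += dt * (-x + J @ np.tanh(x))
    means[step] = np.tanh(x).mean()

M = means[T // 2:].mean()                # time-averaged population rate
print(f"time-averaged population rate M ~ {M:.3f}")
```

Repeating such runs over a grid of inhibitory gains, and comparing the measured mean activity and autocorrelations against the mean-field phase boundaries, is the kind of check the review says would settle the claim.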
Original abstract
Biological neural networks can operate in qualitatively distinct dynamical regimes, and transitions between these regimes are thought to underlie changes in computation and behavior. The seminal work of Sompolinsky, Crisanti, and Sommers (SCS) showed that random recurrent networks undergo a transition from quiescence to asynchronous chaos, establishing a paradigmatic link between random connectivity, dynamical instability, and internally generated fluctuations in neural circuits. Here, we extend this framework to two-population firing-rate networks with segregated excitatory and inhibitory neurons and target-specific inhibitory couplings that break excitation-inhibition balance. Using dynamical mean-field theory, we derive self-consistent equations for the macroscopic mean activities and autocorrelations, together with stability criteria distinguishing mean-driven and fluctuation-driven instabilities. We show that target-specific inhibition organizes the phase diagram into three qualitative classes: inhibition-dominated or strictly balanced networks display only quiescent activity and asynchronous chaos; excitation-dominated networks display persistent activity together with either synchronous chaos with non-vanishing mean activity or coherent oscillations, depending on the stability-matrix eigenvalues. Crucially, coherent oscillations do not coexist with chaotic fluctuations around the periodic mean trajectory; rather, their onset suppresses the chaotic component, reminiscent of input-induced suppression of chaos. These results generalize SCS theory to recurrent networks with explicit excitatory-inhibitory structure and identify target-specific inhibition as a key control parameter for large-scale neural dynamics.
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. The manuscript extends the Sompolinsky-Crisanti-Sommers (SCS) theory to two-population excitatory-inhibitory firing-rate networks with target-specific inhibitory couplings that break E-I balance. Using dynamical mean-field theory, it derives self-consistent equations for macroscopic mean activities and autocorrelations, together with stability criteria based on eigenvalues of a stability matrix. The phase diagram is organized into three classes: inhibition-dominated or strictly balanced networks show only quiescence or asynchronous chaos; excitation-dominated networks exhibit persistent activity with either synchronous chaos (non-vanishing mean) or coherent oscillations, where the latter onset suppresses chaotic fluctuations around the periodic mean trajectory.
Significance. If the central claims hold, this work significantly generalizes the SCS paradigm to biologically structured E-I circuits by identifying target-specific inhibition as a control parameter for transitions between chaotic and synchronous regimes. The DMFT derivation of parameter-free self-consistent equations for means and autocorrelations, combined with explicit stability criteria, provides analytical insight into chaos suppression by oscillations (reminiscent of input-induced suppression), which is a strength for the field.
major comments (2)
- [DMFT derivation and stability criteria] The self-consistent DMFT equations for autocorrelations (as described in the abstract and stability analysis) assume Gaussian fluctuation statistics remain closed in the large-N limit. When target-specific inhibition breaks E-I balance and produces nonzero mean activity, this may allow higher-order correlations to survive, potentially permitting residual chaotic attractors around the periodic orbit that are missed by the linear eigenvalue analysis; this assumption is load-bearing for the no-coexistence claim.
- [Stability analysis (eigenvalues of stability matrix)] The conclusion that 'coherent oscillations do not coexist with chaotic fluctuations around the periodic mean trajectory' (abstract) follows from the stability-matrix eigenvalues distinguishing mean-driven vs. fluctuation-driven instabilities, but lacks explicit verification (e.g., numerical solution of the DMFT equations or checks for multistability) when mean activity is nonzero; this is central to the three-class phase diagram.
minor comments (2)
- [Abstract] The abstract introduces 'stability-matrix eigenvalues' without a brief definition or equation reference; adding one inline would improve accessibility.
- [Phase diagram description] The phase-diagram classification would benefit from a schematic figure mapping the three regimes against parameters such as inhibition strength and target-specificity to make the boundaries concrete.
Simulated Author's Rebuttal
We thank the referee for their detailed and constructive review of our manuscript. We address the major comments below, providing clarifications on the DMFT framework and adding supporting numerical evidence where appropriate.
Point-by-point responses
Referee: [DMFT derivation and stability criteria] The self-consistent DMFT equations for autocorrelations (as described in the abstract and stability analysis) assume Gaussian fluctuation statistics remain closed in the large-N limit. When target-specific inhibition breaks E-I balance and produces nonzero mean activity, this may allow higher-order correlations to survive, potentially permitting residual chaotic attractors around the periodic orbit that are missed by the linear eigenvalue analysis; this assumption is load-bearing for the no-coexistence claim.
Authors: In the large-N limit, the synaptic input to each neuron consists of a sum over many independent random connections. By the central limit theorem, the distribution of these inputs converges to a Gaussian, regardless of whether the mean activity is zero or nonzero. This justifies the closure of the DMFT equations at the level of means and autocorrelations. Higher-order correlations are suppressed as 1/N and do not affect the macroscopic dynamics. The stability analysis of the fluctuation equations around the periodic orbit confirms that chaotic fluctuations are suppressed when the mean-driven oscillatory instability occurs first. We have added a paragraph in the Methods section discussing the validity of the Gaussian assumption in the unbalanced regime. revision: partial
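The central-limit argument in this response can be illustrated numerically. The sketch below is an illustration of the generic CLT mechanism, not a reproduction of the paper's DMFT: the rate distribution, the Laplace coupling distribution, and the 1/sqrt(N) scaling are all assumptions chosen to show non-Gaussian couplings with nonzero-mean rates still yielding near-Gaussian summed inputs as N grows.

```python
import numpy as np

rng = np.random.default_rng(2)

def input_moments(N, trials=5000):
    # Rates with a nonzero mean, as in the unbalanced regime under discussion.
    r = 0.3 + 0.2 * rng.standard_normal(N)
    # Deliberately non-Gaussian (Laplace) couplings, scaled by 1/sqrt(N).
    J = rng.laplace(0.0, 1.0, (trials, N)) / np.sqrt(N)
    h = J @ r                              # summed synaptic input, one per trial
    z = (h - h.mean()) / h.std()
    # Gaussianity diagnostics: both should shrink toward 0 as N grows.
    return abs(np.mean(z**3)), abs(np.mean(z**4) - 3.0)

for N in (10, 100, 1000):
    skew, exkurt = input_moments(N)
    print(f"N={N:5d}: |skewness|={skew:.3f}, |excess kurtosis|={exkurt:.3f}")
```

The excess kurtosis of the summed input decays roughly as 1/N here, which is the sense in which higher-order statistics are suppressed in the large-N closure.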
Referee: [Stability analysis (eigenvalues of stability matrix)] The conclusion that 'coherent oscillations do not coexist with chaotic fluctuations around the periodic mean trajectory' (abstract) follows from the stability-matrix eigenvalues distinguishing mean-driven vs. fluctuation-driven instabilities, but lacks explicit verification (e.g., numerical solution of the DMFT equations or checks for multistability) when mean activity is nonzero; this is central to the three-class phase diagram.
Authors: We acknowledge that direct numerical verification strengthens the claim. In the revised manuscript, we have included numerical solutions of the self-consistent DMFT equations for the mean activities and autocorrelations in the excitation-dominated regime. These simulations show that in the parameter region where the oscillatory solution is stable according to the eigenvalue analysis, the autocorrelation functions exhibit periodic behavior without additional chaotic decay components. Furthermore, we performed checks by varying initial conditions and found no indications of multistability or coexisting chaotic attractors. This supports the three-class phase diagram and the suppression of chaos by oscillations. revision: yes
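The diagnostic described in this response (periodic autocorrelation without an additional chaotic decay component) can be sketched on synthetic signals. Both signals below are stand-ins, not solutions of the paper's DMFT equations: a clean oscillation versus one degraded by phase diffusion, which plays the role of a coexisting chaotic component.

```python
import numpy as np

rng = np.random.default_rng(3)

def autocorr(x, max_lag):
    # Normalized autocorrelation at lags 0..max_lag-1.
    x = x - x.mean()
    c = np.correlate(x, x, mode="full")[len(x) - 1:]
    return c[:max_lag] / c[0]

t = np.arange(20000) * 0.01                    # dt = 0.01, oscillation period = 1.0
coherent = np.sin(2 * np.pi * t)
# Phase diffusion mimics chaotic degradation of the oscillation.
drifting = np.sin(2 * np.pi * t + np.cumsum(0.05 * rng.standard_normal(t.size)))

for name, sig in (("coherent", coherent), ("chaos-like", drifting)):
    late_peak = autocorr(sig, 1001)[900:].max()  # peak height ~10 periods out
    print(f"{name}: late autocorrelation peak = {late_peak:.2f}")
```

A coherent oscillation keeps its late-lag peaks near 1, while any chaotic admixture shows up as a decaying envelope; this is the signature the revised manuscript reportedly checks for and does not find.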
Circularity Check
DMFT derivation self-contained with only non-load-bearing self-citation
full rationale
The self-consistent equations for macroscopic means and autocorrelations, together with the stability-matrix eigenvalue criteria, are obtained directly from the large-N limit of the two-population rate equations via dynamical mean-field theory. No step reduces a claimed prediction to a fitted input or to a self-citation whose content is itself the target result; the organization of the phase diagram into the three classes follows from the explicit target-specific inhibitory terms without circular redefinition. Any prior citations (including to SCS) supply independent background rather than load-bearing uniqueness theorems.
Axiom & Free-Parameter Ledger
axioms (1)
- [domain assumption] Dynamical mean-field theory accurately describes the macroscopic mean activities and autocorrelations in the thermodynamic limit.
Reference graph
Works this paper leans on
- [1] E. R. Kandel (Ed.), Principles of Neural Science, 4th Edition, McGraw-Hill, New York, NY, 2000
- [2]
- [3] W. Gerstner, W. M. Kistler, R. Naud, L. Paninski, Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition, 1st Edition, Cambridge University Press, 2014. doi:10.1017/CBO9781107447615
- [4] E. M. Izhikevich, Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting, Computational Neuroscience, MIT Press, Cambridge, Mass., 2007
- [5] C. Stringer, M. Pachitariu, N. Steinmetz, M. Carandini, K. D. Harris, High-dimensional geometry of population responses in visual cortex, Nature 571 (7765) (2019) 361–365. doi:10.1038/s41586-019-1346-5
- [6] H. Sompolinsky, A. Crisanti, H. J. Sommers, Chaos in Random Neural Networks, Physical Review Letters 61 (3) (1988) 259–262. doi:10.1103/PhysRevLett.61.259
- [7] O. Harish, D. Hansel, Asynchronous Rate Chaos in Spiking Neuronal Circuits, PLOS Computational Biology 11 (7) (2015) e1004266. doi:10.1371/journal.pcbi.1004266
- [8] D. Dahmen, S. Grün, M. Diesmann, M. Helias, Second type of criticality in the brain uncovers rich multiple-neuron dynamics, Proceedings of the National Academy of Sciences 116 (26) (2019) 13051–13060. doi:10.1073/pnas.1818972116
- [9] J. Li, W. L. Shew, Tuning network dynamics from criticality to an asynchronous state, PLOS Computational Biology 16 (9) (2020) e1008268. doi:10.1371/journal.pcbi.1008268
- [10] C. Martorell, R. Calvo, A. Roig, A. Annibale, M. A. Muñoz, Ergodicity Breaking and High-Dimensional Chaos in Random Recurrent Networks (Oct. 2025). doi:10.48550/arXiv.2510.07932
- [11] J. Kadmon, H. Sompolinsky, Transition to chaos in random neuronal networks, Physical Review X 5 (4) (2015) 041030. doi:10.1103/PhysRevX.5.041030
- [12] J. Schuecker, S. Goedeke, M. Helias, Optimal Sequence Memory in Driven Random Networks, Physical Review X 8 (4) (2018) 041029. doi:10.1103/PhysRevX.8.041029
- [13] A. Crisanti, H. Sompolinsky, Path Integral Approach to Random Neural Networks, Physical Review E 98 (6) (2018) 062120. doi:10.1103/PhysRevE.98.062120
- [14] I. D. Landau, H. Sompolinsky, Coherent chaos in a recurrent neural network with structured connectivity, PLOS Computational Biology 14 (12) (2018) e1006309. doi:10.1371/journal.pcbi.1006309
- [15] M. Dick, A. van Meegen, M. Helias, Linking network- and neuron-level correlations by renormalized field theory, Physical Review Research 6 (3) (2024) 033264. doi:10.1103/PhysRevResearch.6.033264
- [16] C. Martorell, R. Calvo, A. Annibale, M. A. Muñoz, Dynamically selected steady states and criticality in non-reciprocal networks, Chaos, Solitons & Fractals 182 (2024) 114809. doi:10.1016/j.chaos.2024.114809
- [17] J. Kadmon, Efficient Coding with Chaotic Neural Networks: A Journey From Neuroscience to Physics and Back, Human Arenas (Jun. 2025). doi:10.1007/s42087-025-00507-9
- [18] D. Sherrington, S. Kirkpatrick, Solvable Model of a Spin-Glass, Physical Review Letters 35 (26) (1975) 1792–1796. doi:10.1103/PhysRevLett.35.1792
- [19] H. Nishimori, Statistical Physics of Spin Glasses and Information Processing: An Introduction, 1st Edition, Oxford University Press, Oxford, 2001. doi:10.1093/acprof:oso/9780198509417.001.0001
- [20]
- [21]
- [22] C. van Vreeswijk, H. Sompolinsky, Chaos in Neuronal Networks with Balanced Excitatory and Inhibitory Activity, Science 274 (5293) (1996) 1724–1726. doi:10.1126/science.274.5293.1724
- [23] C. van Vreeswijk, H. Sompolinsky, Chaotic Balanced State in a Model of Cortical Circuits, Neural Computation 10 (6) (1998) 1321–1371. doi:10.1162/089976698300017214
- [24] N. Brunel, Dynamics of Sparsely Connected Networks of Excitatory and Inhibitory Spiking Neurons, Journal of Computational Neuroscience 8 (2000) 183–208. doi:10.1023/a:1008925309027
- [25] A. Renart, J. de la Rocha, P. Bartho, L. Hollender, N. Parga, A. Reyes, K. D. Harris, The Asynchronous State in Cortical Circuits, Science 327 (5965) (2010) 587–590. doi:10.1126/science.1179850
- [26] V. Buendía, P. Villegas, S. di Santo, A. Vezzani, R. Burioni, M. A. Muñoz, Jensen's force and the statistical mechanics of cortical asynchronous states, Scientific Reports 9 (1) (2019) 15183. doi:10.1038/s41598-019-51520-2
- [27] F. Mastrogiuseppe, S. Ostojic, Intrinsically-generated fluctuating activity in excitatory-inhibitory networks, PLOS Computational Biology 13 (4) (2017) e1005498. doi:10.1371/journal.pcbi.1005498
- [28] F. Mastrogiuseppe, S. Ostojic, Linking Connectivity, Dynamics, and Computations in Low-Rank Recurrent Neural Networks, Neuron 99 (3) (2018) 609–623.e29. doi:10.1016/j.neuron.2018.07.003
- [29] F. Schuessler, A. Dubreuil, F. Mastrogiuseppe, S. Ostojic, O. Barak, Dynamics of random recurrent networks with correlated low-rank structure, Physical Review Research 2 (1) (2020) 013111. doi:10.1103/PhysRevResearch.2.013111
- [30] M. Beiran, A. Dubreuil, A. Valente, F. Mastrogiuseppe, S. Ostojic, Shaping Dynamics With Multiple Populations in Low-Rank Recurrent Networks, Neural Computation 33 (6) (2021) 1572–1615. doi:10.1162/neco_a_01381
- [31] A. Dubreuil, A. Valente, M. Beiran, F. Mastrogiuseppe, S. Ostojic, The role of population structure in computations through neural dynamics, Nature Neuroscience 25 (6) (2022) 783–794. doi:10.1038/s41593-022-01088-4
- [32] L. C. García Del Molino, K. Pakdaman, J. Touboul, G. Wainrib, Synchronization in random balanced networks, Physical Review E 88 (4) (2013) 042824. doi:10.1103/PhysRevE.88.042824
- [33] A. Kepecs, G. Fishell, Interneuron cell types are fit to function, Nature 505 (7483) (2014) 318–326. doi:10.1038/nature12983
- [34] R. Tremblay, S. Lee, B. Rudy, GABAergic Interneurons in the Neocortex: From Cellular Properties to Circuits, Neuron 91 (2) (2016) 260–292. doi:10.1016/j.neuron.2016.06.033
- [35] R. Corral López, V. Buendía, M. A. Muñoz, Excitatory-inhibitory branching process: A parsimonious view of cortical asynchronous states, excitability, and criticality, Physical Review Research 4 (4) (2022) L042027. doi:10.1103/PhysRevResearch.4.L042027
- [36]
- [37] T. P. Vogels, K. Rajan, L. Abbott, Neural Network Dynamics, Annual Review of Neuroscience 28 (1) (2005) 357–376. doi:10.1146/annurev.neuro.28.061604.135637
- [38]
- [39] F. L. Metz, Dynamical Mean-Field Theory of Complex Systems on Sparse Directed Networks, Physical Review Letters 134 (3) (2025) 037401. doi:10.1103/PhysRevLett.134.037401
- [40] G. Benettin, L. Galgani, A. Giorgilli, J.-M. Strelcyn, Lyapunov Characteristic Exponents for smooth dynamical systems and for Hamiltonian systems; A method for computing all of them. Part 2: Numerical application, Meccanica 15 (1) (1980) 21–30. doi:10.1007/BF02128237
- [41] A. Wolf, J. B. Swift, H. L. Swinney, J. A. Vastano, Determining Lyapunov exponents from a time series, Physica D: Nonlinear Phenomena 16 (3) (1985) 285–317. doi:10.1016/0167-2789(85)90011-9
- [42] A. Pikovsky, A. Politi, Lyapunov Exponents: A Tool to Explore Complex Dynamics, Cambridge University Press, Cambridge, 2016
- [43] K. Rajan, L. F. Abbott, H. Sompolinsky, Stimulus-dependent suppression of chaos in recurrent neural networks, Physical Review E 82 (1) (2010) 011903. doi:10.1103/PhysRevE.82.011903
- [44] S. H. Strogatz, Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry, and Engineering, 2nd Edition, CRC Press, Boca Raton, 2019. doi:10.1201/9780429492563
- [45] J. Guckenheimer, P. Holmes, Nonlinear Oscillations, Dynamical Systems, and Bifurcations of Vector Fields, No. 42 in Applied Mathematical Sciences, Springer Science+Business Media, New York, NY, 2002
- [46] L. Cohen, Time Frequency Analysis, Prentice Hall Signal Processing Series, Prentice Hall, Englewood Cliffs, NJ, 1995
- [47] J. M. Beggs, D. Plenz, Neuronal Avalanches in Neocortical Circuits, Journal of Neuroscience 23 (35) (2003) 11167–11177. doi:10.1523/JNEUROSCI.23-35-11167.2003
- [48] J. M. Beggs, The criticality hypothesis: how local cortical networks might optimize information processing, Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 366 (1864) (2008) 329–343. doi:10.1098/rsta.2007.2092
- [49] W. L. Shew, D. Plenz, The Functional Benefits of Criticality in the Cortex, The Neuroscientist 19 (1) (2013) 88–100. doi:10.1177/1073858412445487
- [50] T. Mora, W. Bialek, Are Biological Systems Poised at Criticality?, Journal of Statistical Physics 144 (2) (2011) 268–302. doi:10.1007/s10955-011-0229-4
- [51] M. A. Muñoz, Colloquium: Criticality and dynamical scaling in living systems, Reviews of Modern Physics 90 (031001) (Jul. 2018). doi:10.1103/RevModPhys.90.031001
- [52] L. de Arcangelis, C. Perrone-Capano, H. J. Herrmann, Self-Organized Criticality Model for Brain Plasticity, Physical Review Letters 96 (2) (2006) 028107. doi:10.1103/PhysRevLett.96.028107
- [53] G. Deco, M. L. Kringelbach, V. K. Jirsa, P. Ritter, The dynamics of resting fluctuations in the brain: metastability and its dynamical cortical core, Scientific Reports 7 (1) (2017) 3095. doi:10.1038/s41598-017-03073-5
- [54] J. Wilting, V. Priesemann, 25 years of criticality in neuroscience — established results, open controversies, novel concepts, Current Opinion in Neurobiology 58 (2019) 105–111. doi:10.1016/j.conb.2019.08.002
- [55] J. O'Byrne, K. Jerbi, How critical is brain criticality?, Trends in Neurosciences 45 (11) (2022) 820–837. doi:10.1016/j.tins.2022.08.007
- [56] K. B. Hengen, W. L. Shew, Is criticality a unified setpoint of brain function?, Neuron 113 (16) (2025) 2582–2598.e2. doi:10.1016/j.neuron.2025.05.020
- [57] L. Cocchi, L. L. Gollo, A. Zalesky, M. Breakspear, Criticality in the brain: A synthesis of neurobiology, models and cognition, Progress in Neurobiology 158 (2017) 132–152. doi:10.1016/j.pneurobio.2017.07.002
- [58] T. Tao, V. Vu, M. Krishnapur, Random matrices: Universality of ESDs and the circular law, arXiv:0807.4898 [math] (Apr. 2009)
- [59] T. Tao, Outliers in the spectrum of iid matrices with bounded rank perturbations, Probability Theory and Related Fields 155 (1-2) (2013) 231–263. doi:10.1007/s00440-011-0397-9
- [60] A. Knowles, J. Yin, The outliers of a deformed Wigner matrix, The Annals of Probability 42 (5) (Sep. 2014). doi:10.1214/13-AOP855
- [61] K. Rajan, L. F. Abbott, Eigenvalue Spectra of Random Matrices for Neural Networks, Physical Review Letters 97 (18) (2006) 188104. doi:10.1103/PhysRevLett.97.188104
- [62] F. Roy, G. Biroli, G. Bunin, C. Cammarota, Numerical implementation of dynamical mean field theory for disordered systems: application to the Lotka–Volterra model of ecosystems, Journal of Physics A: Mathematical and Theoretical 52 (48) (2019) 484001. doi:10.1088/1751-8121/ab1f32
- [63] W. Zou, H. Huang, Introduction to dynamical mean-field theory of randomly connected neural networks with bidirectionally correlated couplings, SciPost Physics Lecture Notes (2024) 79. doi:10.21468/SciPostPhysLectNotes.79
Supplemental Material: Carles Martorell, Rubén Calvo, Alessia Annibale and Miguel Ángel Muñoz
- [64] Transitions from the Quiescent State
- [65] Transition from the Asynchronous Chaos
- [66] Stability of the Response-Response Function; Transition from the PA to COA
- [67] Transition from PA to SC; References. I. Excitatory and Inhibitory Firing Rate System: the general system has two populations of, respectively, fN excitatory (E) and (1−f)N inhibitory (I) neurons, f being the fraction of the excitatory population. Denoting x_i(t) (i = 1, ..., fN) and y_k(t) (k = fN+1, ..., N) the time-dependent rates of excitatory and inhibi...
- [68] Transitions from the Quiescent State. For fixed points, such that C(τ) ≡ q for any τ, the Jacobian matrix J reduces to simple equations. When the quiescent solution is considered, described by M = 0 and q = 0, the Jacobian matrix reduces to a diagonal block: J_MQ = J_QM = 0, while J_MM = g J₀ N_{β,δ} and J_QQ = (gJ)² M_{β,δ} (Eq. 67). The eigenvalues of these matrices are, respect...
- [69] Transition from the Asynchronous Chaos. The transition from the AC phase to the SC or COA phase emerges as a bifurcation of the mean activity M; therefore, the transition can be computed by analyzing the stability of the AC state, described by M = 0 and C(0) = C₀ᶜ, against a small perturbation, where C₀ᶜ describes the zero-lag autocorrelator selected by the system ...
- [70] Transition from the PA to COA. The transition from the PA to the COA phase also appears as a bifurcation of the mean activity M. Hence, an analysis identical to that done for the AC-COA/SC transition can be performed, obtaining the same criterion but assuming fixed-point solutions: M and C(0) = q. C. Stability of the Response-Response Function. The phase transition from PA to SC...
- [71] Transition from PA to SC. The transition from the PA to the SC phase is hence described by the second-order response of the autocorrelation. Assuming that the (unperturbed) state is a fixed point, M ≠ 0 and q ≠ 0, the response to perturbations of the autocorrelator is governed by the matrix (gJ)² D_{φ′,φ′} M_{β,δ} − Id (Eq. 78), where [D_{φ′,φ′}]_{αβ} = ⟨φ′(z_α)²⟩ δ_{αβ} (Eq. 79). The st...
- [72] J. Kadmon, H. Sompolinsky, Transition to chaos in random neuronal networks, Physical Review X 5, 041030 (2015)
- [73] O. Harish, D. Hansel, Asynchronous Rate Chaos in Spiking Neuronal Circuits, PLOS Computational Biology 11, e1004266 (2015)
- [74] F. Mastrogiuseppe, S. Ostojic, Intrinsically-generated fluctuating activity in excitatory-inhibitory networks, PLOS Computational Biology 13, e1005498 (2017)
- [75] H. Sompolinsky, A. Crisanti, H. J. Sommers, Chaos in Random Neural Networks, Physical Review Letters 61, 259 (1988)
- [76] J. Schuecker, S. Goedeke, M. Helias, Optimal Sequence Memory in Driven Random Networks, Physical Review X 8, 041029 (2018)
- [77] M. Helias, D. Dahmen, Statistical Field Theory for Neural Networks, Lecture Notes in Physics, Vol. 970, Springer International Publishing, Cham, 2020
- [78] F. Mastrogiuseppe, S. Ostojic, Linking Connectivity, Dynamics, and Computations in Low-Rank Recurrent Neural Networks, Neuron 99, 609 (2018)
- [79] C. Martorell, R. Calvo, A. Roig, A. Annibale, M. A. Muñoz, Ergodicity Breaking and High-Dimensional Chaos in Random Recurrent Networks (2025)