pith. machine review for the scientific record.

arxiv: 2604.24141 · v1 · submitted 2026-04-27 · ❄️ cond-mat.dis-nn · q-bio.NC

Recognition: unknown

Solution of a large nonlinear recurrent neural network at fixed connectivity

Albert J. Wakhloo

Authors on Pith: no claims yet

Pith reviewed 2026-05-07 17:30 UTC · model grok-4.3

classification ❄️ cond-mat.dis-nn q-bio.NC
keywords recurrent neural networks · large N limit · correlation functions · 1/sqrt(N) expansion · synaptic connectivity · spontaneous activity · response functions · nonlinear dynamics

The pith

In the large-N limit, moments and response functions of nonlinear recurrent neural networks with fixed connectivity are solvable to leading 1/sqrt(N) order without averaging over weights.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper shows how to calculate the statistical moments and response functions of a nonlinear random recurrent neural network exactly in the limit of many neurons. The calculation yields the first correction term in a 1/sqrt(N) expansion for correlation functions of arbitrary order. A sympathetic reader would care because it connects the details of synaptic wiring directly to patterns of spontaneous firing and to how the network reacts to small inputs. The method avoids the common step of averaging over many possible weight realizations, and it confirms a recent mathematical conjecture by Shen and Hu as a particular case.
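The review does not reproduce the paper's equations, so the following is a minimal sketch of the presumed model class, assuming the standard rate-network form dx/dt = -x + J·phi(x) + I with phi = tanh and a Gaussian J drawn once and then held fixed (the setting of the Sompolinsky-Crisanti-Sommers line of work cited in the reference graph below); all names and parameter values here are illustrative, not taken from the paper.

```python
import numpy as np

# Minimal sketch, assuming the standard rate-network form; the paper's exact
# model is not given in this review. J is drawn ONCE and then held fixed:
# the quantities of interest are conditioned on this single realization.
rng = np.random.default_rng(0)
N, g, dt, n_steps = 500, 1.5, 0.01, 2000

J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))  # one fixed connectivity matrix
x = rng.normal(size=N)                            # initial condition

traj = np.empty((n_steps, N))
for t in range(n_steps):
    x += dt * (-x + J @ np.tanh(x))  # dx/dt = -x + J phi(x), zero external input
    traj[t] = x

# An intensive-order observable: population- and time-averaged squared rate,
# computed after discarding an initial transient.
C2 = np.mean(np.tanh(traj[500:]) ** 2)
print(f"mean squared rate for this fixed J: {C2:.4f}")
```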

Core claim

We calculate the moments and response functions of a nonlinear random recurrent neural network in the large N limit without averaging over synaptic weights. This provides the leading nontrivial term in the 1/sqrt(N) expansion for general intensive-order correlation functions and thereby proves a conjecture by Shen and Hu in a special case. The results establish an analytical connection between the synaptic connectivity matrix, the correlations present in the network's spontaneous activity, and the linear response of the network to small perturbations.

What carries the argument

The 1/sqrt(N) expansion of intensive-order correlation functions in the large-N limit with fixed random connectivity, which allows direct computation without ensemble averaging.

If this is right

  • General correlation functions of any order receive a systematic correction at order 1/sqrt(N)
  • Spontaneous activity correlations are directly determined by the connectivity statistics
  • The network's response to perturbations is analytically linked to its internal correlations
  • The Shen-Hu conjecture holds as a special case of the general result

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • Such expansions may apply to other high-dimensional dynamical systems with random but fixed interactions
  • Numerical simulations of finite networks could be compared directly to these predictions to extract effective parameters
  • The approach might help in understanding how specific connectivity motifs shape information processing in biological neural circuits

Load-bearing premise

The large-N limit with fixed random connectivity permits a systematic 1/sqrt(N) expansion of correlation functions without requiring explicit averaging over the weight distribution.
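Written out schematically (the notation below is assumed for illustration, not quoted from the paper), the premise is that an intensive-order correlation function built from a single fixed J admits an expansion of the form:

```latex
% Schematic statement of the load-bearing premise (assumed notation):
C^{(m)}(t_1,\dots,t_m)
  = \frac{1}{N}\sum_{i=1}^{N}
    \left\langle \phi\big(x_i(t_1)\big)\cdots\phi\big(x_i(t_m)\big)\right\rangle
  = C^{(m)}_0 + \frac{1}{\sqrt{N}}\, C^{(m)}_1[J] + \mathcal{O}\!\left(N^{-1}\right)
```

Here C_0 is the deterministic mean-field value and C_1[J], the leading correction, depends on the particular connectivity matrix rather than only on its distribution; the premise forbids any derivation step that replaces a J-dependent sum by its ensemble expectation.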

What would settle it

Direct comparison of the predicted 1/sqrt(N) correction to correlation functions against numerical simulations of the network dynamics for increasing values of N.
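A minimal sketch of such a test, under the same assumed model form as above: since the review does not reproduce the paper's analytic formulas, the N → ∞ value is proxied here by a very large network, and the claim to check is that the single-realization deviation shrinks like N^(-1/2).

```python
import numpy as np

def mean_sq_rate(N, g=1.5, dt=0.01, n_steps=4000, burn=1000, seed=0):
    """Time-averaged mean-square rate for ONE fixed realization of J."""
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
    x = rng.normal(size=N)
    acc = n = 0
    for t in range(n_steps):
        x += dt * (-x + J @ np.tanh(x))
        if t >= burn:
            acc += np.mean(np.tanh(x) ** 2)
            n += 1
    return acc / n

# Proxy for the N -> infinity (mean-field) value; the paper would supply
# this analytically instead.
C_inf = mean_sq_rate(4000)

for N in (100, 200, 400, 800, 1600):
    dev = abs(mean_sq_rate(N) - C_inf)
    # If the 1/sqrt(N) claim holds, the scaled column should stay O(1)
    # rather than grow or shrink systematically with N.
    print(f"N={N:5d}  |C_N - C_inf| = {dev:.4f}  sqrt(N)*dev = {np.sqrt(N) * dev:.3f}")
```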

read the original abstract

We calculate the moments and response functions of a nonlinear random recurrent neural network in the large $N$ limit. Our approach does not require averaging over synaptic weights and gives the first nontrivial term in a $1/\sqrt{N}$ expansion of general intensive-order correlation functions, proving a recent conjecture by Shen and Hu as a special case. Our results provide an analytical link between synaptic connectivity, correlations in spontaneous activity, and the response of a network to small perturbations.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

3 major / 3 minor

Summary. The manuscript develops an analytical approach to compute moments and response functions for a large nonlinear recurrent neural network with fixed random connectivity in the thermodynamic limit. It derives the leading correction in a systematic 1/√N expansion for intensive correlation functions without performing an ensemble average over the connectivity matrix, and demonstrates that this recovers a recent conjecture by Shen and Hu in a special case. The results connect the structure of the synaptic weights to the statistics of spontaneous activity and linear response.

Significance. If the derivation is free of implicit averaging, this work would represent a notable advance in the field of disordered neural networks. Most existing analyses rely on disorder averaging to obtain closed equations; a realization-specific expansion would allow direct application to individual networks and provide falsifiable predictions for how specific connectivity features influence correlations. The proof of the Shen-Hu conjecture as a byproduct strengthens the result.

major comments (3)
  1. [§3 (derivation of the expansion)] The central claim that the 1/√N expansion of intensive-order correlation functions holds for a single fixed realization of J (no disorder average) is load-bearing for the entire result. In the derivation of the closed equations for the correlation functions (likely §3), every step that would normally invoke the law of large numbers on sums involving fixed but random J_ij must be replaced by a deterministic relation. Please identify the specific equation(s) where the effective field or response is closed and demonstrate explicitly that no local replacement by an expectation value occurs.
  2. [Numerical results / validation section] Table 1 or the numerical validation section: the comparison between analytic 1/√N predictions and direct simulation must be performed on individual fixed realizations of the connectivity matrix, not on disorder-averaged quantities. If the reported agreement relies on averaging over multiple J matrices, this undermines the no-averaging claim and the applicability to fixed networks.
  3. [§4 or special-case reduction] The reduction to the Shen-Hu conjecture is presented as a special case. Show explicitly in the relevant section how the general intensive-order expansion specializes to their result while preserving the fixed-connectivity property; any step that reintroduces an average at this reduction would require re-examination of the general claim.
minor comments (3)
  1. [Model definition] The model section introduces the nonlinearity f but the precise assumptions on its smoothness or boundedness (needed for the expansion to close) appear only later; move the full statement of assumptions to the model definition.
  2. [Figures] Figure captions for the correlation plots should explicitly state whether the plotted curves are for one realization or averaged, to avoid ambiguity with the main claim.
  3. [Throughout] A few instances of inconsistent notation for the response function (sometimes R, sometimes χ) should be unified.

Simulated Author's Rebuttal

3 responses · 0 unresolved

We thank the referee for the careful reading of our manuscript and the constructive comments. We address each major comment point by point below, providing explicit references to the relevant equations and indicating revisions where they strengthen the presentation without altering the core claims.

read point-by-point responses
  1. Referee: [§3 (derivation of the expansion)] The central claim that the 1/√N expansion of intensive-order correlation functions holds for a single fixed realization of J (no disorder average) is load-bearing for the entire result. In the derivation of the closed equations for the correlation functions (likely §3), every step that would normally invoke the law of large numbers on sums involving fixed but random J_ij must be replaced by a deterministic relation. Please identify the specific equation(s) where the effective field or response is closed and demonstrate explicitly that no local replacement by an expectation value occurs.

    Authors: In Section 3 the effective field is defined in Eq. (8) for the fixed matrix J as h_i(t) = ∑_j J_ij s_j(t) + I_i(t). The intensive correlation functions are closed in Eqs. (15)–(18) via a 1/√N expansion. The key deterministic step occurs in Eq. (17), where the second-moment contribution of the local fields is written directly in terms of the fixed J without replacing any sum by its ensemble expectation; the 1/√N fluctuation bound is derived in the appendix by controlling the deviation for a single realization using concentration inequalities that hold pathwise for the given J. No local replacement by an expectation value is performed. We will add a short clarifying paragraph after Eq. (17) that repeats this argument in explicit steps. revision: partial

  2. Referee: [Numerical results / validation section] Table 1 or the numerical validation section: the comparison between analytic 1/√N predictions and direct simulation must be performed on individual fixed realizations of the connectivity matrix, not on disorder-averaged quantities. If the reported agreement relies on averaging over multiple J matrices, this undermines the no-averaging claim and the applicability to fixed networks.

    Authors: All numerical comparisons in Section 5 (Table 1 and the associated figures) were performed on single fixed realizations of J. For each panel the analytic 1/√N curves were evaluated using the identical fixed J that was used to generate the simulated trajectories; no ensemble average over multiple connectivity matrices appears in the reported data. We will revise the figure captions and the opening paragraph of Section 5 to state this explicitly (the fixed-realization versus ensemble-average distinction is sketched just after this list). revision: yes

  3. Referee: [§4 or special-case reduction] The reduction to the Shen-Hu conjecture is presented as a special case. Show explicitly in the relevant section how the general intensive-order expansion specializes to their result while preserving the fixed-connectivity property; any step that reintroduces an average at this reduction would require re-examination of the general claim.

    Authors: Section 4 obtains the Shen-Hu result by substituting the linear activation function into the general intensive-order expansion of Eq. (20). The resulting expression (Eqs. (25)–(27)) matches the conjectured form exactly, and every sum remains over the fixed J_ij; no disorder average is inserted at any step. We will insert an explicit line-by-line substitution between Eqs. (24) and (25) to make the preservation of the fixed-connectivity property transparent (a sketch of the linear special case follows this list). revision: yes
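On major comment 2, the distinction the referee presses on can be made concrete. A minimal sketch under the same assumed model form as above, contrasting the observable measured for one fixed J with its average over fresh draws of J; the paper's claim is that its analytic prediction tracks the former, which differs from the latter at order 1/sqrt(N). The function name and parameters are illustrative.

```python
import numpy as np

def mean_sq_rate_for(J, dt=0.01, n_steps=4000, burn=1000, seed=1):
    """Time-averaged mean-square rate for a GIVEN connectivity matrix J."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=J.shape[0])
    acc = n = 0
    for t in range(n_steps):
        x += dt * (-x + J @ np.tanh(x))
        if t >= burn:
            acc += np.mean(np.tanh(x) ** 2)
            n += 1
    return acc / n

rng = np.random.default_rng(42)
N, g = 400, 1.5
draw = lambda: rng.normal(0.0, g / np.sqrt(N), size=(N, N))

single = mean_sq_rate_for(draw())                                  # one fixed realization
ensemble = np.mean([mean_sq_rate_for(draw()) for _ in range(10)])  # disorder average

# A fixed-realization theory must be validated against `single`, not `ensemble`;
# the two differ by a realization-dependent O(1/sqrt(N)) term.
print(f"fixed J: {single:.4f}   average over 10 draws: {ensemble:.4f}")
```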
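On major comment 3: what makes the linear special case a clean check is that for linear dynamics the stationary covariance of a noise-driven network is exactly computable for any fixed J via a Lyapunov equation, with no averaging anywhere. A minimal sketch, assuming dynamics dx = (-x + Jx) dt + sqrt(2σ²) dW; the Shen-Hu covariance-spectrum result concerns nonlinear networks, so this shows only the fixed-J linear baseline the reduction lands on.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

rng = np.random.default_rng(7)
N, g, sigma2 = 300, 0.8, 1.0          # g < 1 keeps the linear network stable

J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))  # one fixed realization
A = -np.eye(N) + J                                # drift of dx = A x dt + sqrt(2*sigma2) dW

# The stationary covariance S solves the Lyapunov equation
#   A S + S A^T = -2*sigma2*I,
# exactly and for this specific J: no disorder average enters.
S = solve_continuous_lyapunov(A, -2.0 * sigma2 * np.eye(N))
S = 0.5 * (S + S.T)                   # symmetrize away numerical asymmetry

eigs = np.linalg.eigvalsh(S)
print(f"covariance spectrum for this fixed J: min {eigs.min():.3f}, max {eigs.max():.3f}")
```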

Circularity Check

0 steps flagged

No significant circularity; the derivation proceeds from large-N fixed-connectivity assumptions without smuggling the target result into its inputs.

full rationale

The paper derives moments and response functions via a systematic 1/sqrt(N) expansion for a single fixed realization of the connectivity matrix J, without ensemble averaging. The abstract states the approach 'does not require averaging over synaptic weights' and obtains the expansion directly, proving the Shen-Hu conjecture as a special case. No self-definitional steps, fitted parameters renamed as predictions, or load-bearing self-citations appear in the provided description. The central claim remains independent of the target result and is self-contained under the stated large-N limit with fixed random connectivity.

Axiom & Free-Parameter Ledger

0 free parameters · 1 axiom · 0 invented entities

Only the abstract is available, so the ledger is necessarily incomplete and inferred from standard assumptions in this subfield. The central claim rests on the validity of the large-N expansion for fixed random connectivity.

axioms (1)
  • domain assumption The large-N limit permits a systematic 1/sqrt(N) expansion of intensive-order correlation functions without explicit weight averaging.
    This is the core technical premise stated in the abstract.

pith-pipeline@v0.9.0 · 5362 in / 1111 out tokens · 65818 ms · 2026-05-07T17:30:29.525928+00:00 · methodology

discussion (0)


Reference graph

Works this paper leans on

22 extracted references · 3 canonical work pages · 1 internal anchor

  1. H. Sompolinsky, A. Crisanti and H.-J. Sommers, Chaos in random neural networks, Physical Review Letters 61 (1988) 259
  2. K. Rajan, L. Abbott and H. Sompolinsky, Stimulus-dependent suppression of chaos in recurrent neural networks, Physical Review E 82 (2010) 011903
  3. J. Kadmon and H. Sompolinsky, Transition to chaos in random neuronal networks, Phys. Rev. X 5 (2015) 041030 [1508.06486]
  4. F. Mastrogiuseppe and S. Ostojic, Linking connectivity, dynamics, and computations in low-rank recurrent neural networks, Neuron 99 (2018) 609
  5. D.G. Clark, L. Abbott and A. Litwin-Kumar, Dimension of activity in random neural networks, Physical Review Letters 131 (2023) 118401
  6. A. Crisanti and H. Sompolinsky, Path integral approach to random neural networks, Physical Review E 98 (2018) 062120
  7. M. Helias and D. Dahmen, Statistical Field Theory for Neural Networks, vol. 970, Springer (2020)
  8. X. Shen and Y. Hu, Covariance spectrum in nonlinear recurrent neural networks, arXiv:2508.05288 (2025)
  9. P.C. Martin, E.D. Siggia and H.A. Rose, Statistical dynamics of classical systems, Physical Review A 8 (1973) 423
  10. J.A. Hertz, Y. Roudi and P. Sollich, Path integral methods for the dynamics of stochastic and disordered systems, Journal of Physics A: Mathematical and Theoretical 50 (2016) 033001
  11. G.W. Anderson, A. Guionnet and O. Zeitouni, An Introduction to Random Matrices, no. 118, Cambridge University Press (2010)
  12. D. Bessis, C. Itzykson and J.-B. Zuber, Quantum field theory techniques in graphical enumeration, Advances in Applied Mathematics 1 (1980) 109
  13. D.G. Clark, Linear equivalence of nonlinear recurrent neural networks, arXiv preprint (2026)
  14. G. Mato, F. Rigatuso and G. Torroba, Statistics of correlations in nonlinear recurrent neural networks, arXiv:2510.21742 (2025)
  15. F. Roy, G. Biroli, G. Bunin and C. Cammarota, Numerical implementation of dynamical mean field theory for disordered systems: Application to the Lotka-Volterra model of ecosystems, Journal of Physics A: Mathematical and Theoretical 52 (2019) 484001
  16. S. Sachdev and J. Ye, Gapless spin-fluid ground state in a random quantum Heisenberg magnet, Phys. Rev. Lett. 70 (1993) 3339
  17. D. Chowdhury, A. Georges, O. Parcollet and S. Sachdev, Sachdev-Ye-Kitaev models and beyond: Window into non-Fermi liquids, Rev. Mod. Phys. 94 (2022) 035004
  18. T.R. Kirkpatrick and D. Thirumalai, Dynamics of the structural glass transition and the p-spin-interaction spin-glass model, Physical Review Letters 58 (1987) 2091
  19. L.F. Cugliandolo and J. Kurchan, Analytical solution of the off-equilibrium dynamics of a long-range spin-glass model, Physical Review Letters 71 (1993) 173
  20. P. Charbonneau, E. Marinari, G. Parisi, F. Ricci-Tersenghi, G. Sicuro, F. Zamponi et al., Spin Glass Theory and Far Beyond: Replica Symmetry Breaking after 40 Years, World Scientific (2023)
