pith. machine review for the scientific record.

arxiv: 2605.03338 · v1 · submitted 2026-05-05 · 💻 cs.NE · math.DS

Recognition: unknown

Symmetry-Protected Lyapunov Neutral Modes in Equivariant Recurrent Networks

Hanson Hanxuan Mo

Pith reviewed 2026-05-09 16:42 UTC · model grok-4.3

classification 💻 cs.NE math.DS
keywords equivariant recurrent networks · Lyapunov exponents · group orbits · neutral modes · symmetry protection · path integration · Lie groups · dynamical systems

The pith

Exact equivariance under a Lie group forces at least dim(G/H) zero Lyapunov exponents tangent to the group orbit in compact invariant sets.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper proves that when a finite-dimensional autonomous flow respects exact symmetry under a Lie group, the geometry of the group orbits automatically produces neutral directions with zero growth rates. These directions sit in the tangent space to the orbits and require no parameter tuning to stay neutral. In recurrent networks this supplies a built-in mechanism for preserving continuous variables such as position or phase over long times. The guarantee holds at every point where the Lyapunov spectrum exists, provided the invariant set carries a uniformly nondegenerate orbit bundle with fixed stabilizer.

Core claim

For a finite-dimensional autonomous C^1 vector field equivariant under a Lie group G, any compact invariant set carrying a uniformly nondegenerate group-orbit bundle with stabilizer type H has, at points where the Lyapunov spectrum is defined, at least dim(G/H) zero Lyapunov exponents tangent to the group orbit. These symmetry-protected modes have zero group-tangent growth because of exact equivariance and orbit geometry.

What carries the argument

the tangent space to the group orbit inside an equivariant vector field, which remains neutral by orbit geometry and exact equivariance

If this is right

  • When exact equivariance is deliberately broken, the protected direction develops a measurable pseudo-gap that correlates with finite memory lifetime in recurrent networks.
  • An exactly equivariant recurrent cell trained on S^1 path integration achieves a step-equivariance error of 3.2×10^-8 and a near-zero group-tangent exponent under autonomous zero-input flow.
  • The count of zero group-tangent exponents scales directly with orbit dimension across S1, T^q, SO(n), U(m), and product groups in both synthetic and trained systems.
  • Principal-angle alignment between learned subspaces and group tangents, together with autonomous-flow-zero controls, confirms that the neutral modes arise from symmetry rather than from task-specific tuning.
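The pseudo-gap mechanism in the first bullet can be illustrated with a one-parameter breaking family (a hypothetical stand-in for the paper's phase-pinning control, not its actual experiment): under the autonomous flow theta' = -eps*sin(theta), the formerly neutral phase direction acquires a group-tangent exponent of about -eps, so the memory lifetime scales as 1/eps.

```python
import math

def group_tangent_exponent(eps, theta0=1.0, T=400.0, dt=1e-3):
    # Exponent of a phase perturbation under the autonomous flow
    # theta' = -eps*sin(theta): an illustrative phase-pinning term
    # that breaks exact S^1 equivariance when eps > 0.
    theta, log_d = theta0, 0.0
    for _ in range(int(T / dt)):
        log_d += dt * (-eps * math.cos(theta))  # d/dt log|delta_theta|
        theta += dt * (-eps * math.sin(theta))  # Euler step of base flow
    return log_d / T

for eps in (0.0, 0.05, 0.1):
    lam = group_tangent_exponent(eps)
    life = float('inf') if lam == 0 else -1.0 / lam
    print(f"eps={eps:.2f}  group-tangent exponent={lam:+.4f}  lifetime~{life:.1f}")
```

At eps = 0 the exponent is exactly zero (the symmetry-protected case); for small eps > 0 it tracks -eps, reproducing in miniature the gap-controlled lifetime behavior the paper reports for its breaking families.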

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The same orbit-tangent protection could be used as a diagnostic to check whether a trained recurrent model has retained continuous-state capacity even after fine-tuning.
  • Relaxing exact equivariance to controlled approximate symmetry while monitoring the size of the resulting pseudo-gap may give a practical route to trade strict protection for greater flexibility.
  • The result suggests testing whether biological circuits that maintain phase or position information rely on analogous symmetry-protected neutral modes.
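The subspace-alignment diagnostic suggested above reduces to computing principal angles between a learned subspace and the analytic group tangent, which takes a few lines. This is an illustrative sketch; the bases, dimensions, and tolerances are made up, not taken from the paper.

```python
import numpy as np

def principal_angles(A, B):
    # Principal angles (radians) between the column spans of A and B:
    # orthonormalize each basis, then the singular values of Qa^T Qb
    # are the cosines of the principal angles.
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    s = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    return np.arccos(np.clip(s, -1.0, 1.0))

# toy check: a "learned" subspace that is a tiny perturbation of the
# analytic group-tangent plane in R^4 (names here are illustrative)
rng = np.random.default_rng(0)
tangent = np.eye(4)[:, :2]                 # analytic group-tangent basis
learned = tangent + 1e-3 * rng.standard_normal((4, 2))
angles = principal_angles(learned, tangent)
print(np.degrees(angles))                  # all angles near 0 degrees
```

Near-zero angles indicate the learned subspace still contains the group-tangent directions; angles near 90 degrees would signal that continuous-state capacity was lost, e.g. after aggressive fine-tuning.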

Load-bearing premise

The vector field must be exactly equivariant under the group and the compact invariant set must carry a uniformly nondegenerate orbit bundle with fixed stabilizer type.

What would settle it

A numerical example of an exactly equivariant C^1 vector field whose compact invariant set shows fewer than dim(G/H) zero exponents in the group-tangent directions at a point where the full Lyapunov spectrum is defined.

Figures

Figures reproduced from arXiv: 2605.03338 by Hanson Hanxuan Mo.

Figure 1: Dimension-law evidence for exact continuous-symmetry models. (A) Product-torus systems show observed near-zero counts matching the expected orbit dimension dim(G/H) = q. (B) The same count law holds for SO(n)/SO(n−1), U(m)/U(m−1), and product-group examples. (C) Representative spectra show a neutral block followed by stable transverse exponents. Near-zero counts use the experiment-specific thresholds r…
Figure 2: Neutral-subspace geometry and multiplicity beyond the flow direction. (A) Principal-angle diagnostics align numerical neutral subspaces with analytical group tangents; angles are plotted directly in degrees with explicit 10^-7-scale tick labels. (B) Product-group and higher-dimensional examples show the expected q group-tangent directions independent of any single autonomous-flow direction. The relative-eq…
Figure 3: Coupled equivariant RNN-style branch and broken control. The exact branch uses the weighted S^1 representation ρ_θ(z, w, h) = (e^{iθ} z, e^{2iθ} w, h), so the charge-one component z, charge-two component w, and invariant hidden state h transform in a prescribed way under phase shifts. The broken control adds a non-equivariant phase-pinning perturbation to the same recurrent branch, making explicit which part of th…
Figure 4: Symmetry breaking opens pseudo-gaps. (A) Measured lifetimes match predicted gap-controlled lifetimes. (B) Memory lifetime decreases as breaking magnitude ϵ increases. (C) In random anisotropic breaking, measured symmetry-direction exponents track perturbative predictions and scale with equivariance error. These panels support the pseudo-gap consequence for the explicit breaking families tested here.
Figure 5: End-to-end learned equivariant path integration. (A) Horizon extrapolation compares the exact equivariant cell with a broken equivariant control and matched GRU, LSTM, and orthogonal-RNN baselines after training with horizon 64 and in-distribution speed. (B) Speed out-of-distribution evaluation averages held-out velocity scales at test horizon 128. (C) Restricted-phase training tests generalization from a …
read the original abstract

Recurrent networks that store position, phase, or other continuous variables need state-space directions that remain neutral over long horizons. We give a symmetry-based account of when such neutral directions are guaranteed rather than merely tuned. For a finite-dimensional autonomous \(C^1\) vector field equivariant under a Lie group \(G\), we prove that any compact invariant set carrying a uniformly nondegenerate group-orbit bundle with stabilizer type \(H\) has, at points where the Lyapunov spectrum is defined, at least \(\dim(G/H)\) zero Lyapunov exponents tangent to the group orbit. These symmetry-protected modes have zero group-tangent growth because of exact equivariance and orbit geometry. When this protection is explicitly broken, the formerly protected direction can acquire a pseudo-gap; in our controlled breaking experiments this pseudo-gap predicts finite memory lifetime. We verify the finite-dimensional consequences with normalized equivariance error, direct group-tangent exponents, principal-angle alignment, autonomous-flow-zero controls, and orbit-dimension scaling across \(S^1\), \(T^q\), \(SO(n)\), \(U(m)\), product-group, and coupled equivariant RNN-style systems. We also train an exactly equivariant recurrent cell on velocity-input \(S^1\) path integration across six seeds and compare it with matched GRU, LSTM, and orthogonal-RNN baselines. The learned equivariant cell preserves step equivariance to \(3.2\times10^{-8}\), has a near-zero group-tangent exponent under the zero-input autonomous restriction, and improves horizon, speed, and restricted-phase generalization in this matched protocol. The learned task results are consequence evidence; the theorem-level evidence remains exact equivariance, group-tangent exponents, orbit-dimension scaling, and tangent-subspace alignment.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

0 major / 2 minor

Summary. The manuscript proves that for a finite-dimensional autonomous C^1 vector field equivariant under a Lie group G, any compact invariant set carrying a uniformly nondegenerate group-orbit bundle with stabilizer type H possesses at least dim(G/H) zero Lyapunov exponents tangent to the group orbit at points where the spectrum is defined. These symmetry-protected neutral modes arise directly from equivariance and orbit geometry. The paper verifies the finite-dimensional consequences via normalized equivariance error, direct group-tangent exponents, principal-angle alignment, autonomous-flow-zero controls, and orbit-dimension scaling across S^1, T^q, SO(n), U(m), product groups, and coupled equivariant RNN-style systems. It further trains an exactly equivariant recurrent cell on velocity-input S^1 path integration, showing preservation of step equivariance to 3.2e-8, near-zero group-tangent exponent under autonomous restriction, and improved horizon/speed/generalization relative to matched GRU, LSTM, and orthogonal-RNN baselines; discrete RNN experiments are presented as consequence checks rather than direct theorem applications.

Significance. If the central theorem holds, the work supplies a symmetry-based guarantee for the existence of neutral directions in recurrent network state spaces without parameter tuning, directly relevant to modeling continuous variables such as position or phase. The explicit use of exact equivariance preservation, orbit-dimension scaling, and tangent-subspace alignment as verification methods, together with the controlled symmetry-breaking experiments linking pseudo-gaps to memory lifetime, constitutes a strength. The result bridges equivariant dynamics and RNN design in a falsifiable manner.

minor comments (2)
  1. [Abstract and §4 (Experiments)] The abstract states that discrete RNN experiments serve as 'consequence checks'; the main text should explicitly delineate which experimental controls (e.g., autonomous-flow-zero) directly test the theorem versus which test downstream task performance, to avoid any appearance of overclaiming direct applicability.
  2. [§5 (Path Integration Results)] The path-integration comparison reports improvements across six seeds; the manuscript should include per-seed variance or statistical tests for the horizon, speed, and restricted-phase metrics in the main text or a supplementary table to strengthen the empirical claim.

Simulated Author's Rebuttal

0 responses · 0 unresolved

We thank the referee for the positive and accurate summary of our manuscript, as well as the favorable significance assessment. The recommendation for minor revision is noted. No specific major comments appear in the report, so we provide no point-by-point rebuttals below. We will incorporate any minor editorial or presentational suggestions in the revised version.

Circularity Check

0 steps flagged

No significant circularity

full rationale

The paper's central claim is a theorem proved from the definition of equivariance (the vector field commutes with the group action) together with standard facts about the tangent bundle to a group orbit. The zero Lyapunov exponents follow directly because exact equivariance forces the linearized flow to carry orbit-tangent vectors to orbit-tangent vectors, with norms uniformly bounded above and below on the compact invariant set, so their time-averaged growth rate vanishes. This reduction is external to any fitted parameters or self-referential definitions inside the paper. The RNN experiments are presented only as consequence checks that verify the finite-dimensional implications; they do not supply the proof or close any derivation loop. No load-bearing step reduces to its own inputs by construction, and no self-citation chain is invoked for the uniqueness or existence of the neutral modes.
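The reduction the audit describes can itself be checked numerically: equivariance of the flow map implies Dφ_T(x)·X_ξ(x) = X_ξ(φ_T(x)), i.e. the push-forward of a group-tangent vector is the group-tangent vector at the evolved point. A minimal check on an illustrative rotation-equivariant field (assuming Euler time stepping, which preserves the equivariance exactly):

```python
import numpy as np

J = np.array([[0.0, -1.0], [1.0, 0.0]])   # generator of SO(2)

def f(x, omega=0.7):
    # rotation-equivariant planar field: f(Rx) = R f(x) for rotations R
    return (1.0 - x @ x) * x + omega * (J @ x)

def flow(x, T=3.0, dt=1e-4):
    # Euler time-T flow map; equivariant because each step is
    for _ in range(int(T / dt)):
        x = x + dt * f(x)
    return x

x0 = np.array([0.9, 0.4])
v0 = J @ x0                                # group generator X_xi(x0)

# finite-difference push-forward D phi_T(x0) applied to v0
h = 1e-6
pushed = (flow(x0 + h * v0) - flow(x0 - h * v0)) / (2 * h)

target = J @ flow(x0)                      # X_xi at the evolved point
print(np.linalg.norm(pushed - target))     # ≈ 0: tangent maps to tangent
```

Since the generator field has norm bounded away from zero and infinity near the invariant circle, this identity forces the group-tangent Lyapunov exponent to be zero, with no fitted quantity anywhere in the loop.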

Axiom & Free-Parameter Ledger

0 free parameters · 3 axioms · 0 invented entities

The result rests on standard assumptions from smooth dynamical systems and Lie-group geometry; no free parameters are introduced and no new entities are postulated.

axioms (3)
  • standard math The vector field is finite-dimensional, autonomous, and C^1.
    Required for the Lyapunov spectrum to be defined at the relevant points.
  • domain assumption The system is exactly equivariant under the Lie group G.
    Exact equivariance is the source of the protection for the neutral modes.
  • domain assumption The compact invariant set carries a uniformly nondegenerate group-orbit bundle with stabilizer type H.
    This orbit structure is needed to guarantee the zero exponents are tangent to the orbit.

pith-pipeline@v0.9.0 · 5609 in / 1440 out tokens · 48196 ms · 2026-05-09T16:42:13.962379+00:00 · methodology


Reference graph

Works this paper leans on

30 extracted references · 6 canonical work pages · 1 internal anchor
